
International Conference on Computer and Computational Intelligence (ICCCI 2011)

  • Author(s)/Editor(s):
  • Published:
    2011
  • DOI:
    10.1115/1.859926

Proceedings of the International Conference on Computer and Computational Intelligence, December 2–3, 2011, Bangkok, Thailand. ICCCI is the primary annual computer and computational intelligence conference, aimed at presenting current research from scientists, scholars, engineers, and students from universities and industry around the world, and thereby fostering research relationships among them. The conference provides opportunities for delegates to exchange new ideas and experiences face to face, to establish business or research relationships, and to find global partners for future collaboration.

  • Copyright:
    All rights reserved. Printed in the United States of America. Except as permitted under the United States Copyright Act of 1976, no part of this publication may be reproduced or distributed in any form or by any means, or stored in a database or retrieval system, without the prior written permission of the publisher. © 2011 ASME
  • ISBN:
    9780791859926
  • No. of Pages:
    740
  • Order No.:
    859926
Front Matter
  • Session 1: Computer and Computational Intelligence


      This article shows the importance of Bayesian classifiers for prediction in data mining, and how components such as coverage and accuracy can improve classification performance: an analysis is performed showing that a mathematical model such as Naive Bayes can be improved by adding coverage and precision. Finally, we believe that this improvement may be useful in many types of applications; it can serve as a support tool for research on breast cancer and for decision making in the allocation of resources for prevention and treatment, and it can also be used to improve previous applications in many ways.
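The abstract does not define coverage and accuracy formally; a common definition in data mining is that a rule's coverage is the fraction of records it applies to, and its accuracy is the fraction of covered records it labels correctly. A minimal sketch under that assumption (the data and rule below are invented for illustration):

```python
# Hypothetical illustration: coverage and accuracy of a simple
# classification rule on labeled records (feature dict, label).
def rule_stats(records, condition, predicted_label):
    covered = [(x, y) for x, y in records if condition(x)]
    coverage = len(covered) / len(records)
    accuracy = (sum(1 for _, y in covered if y == predicted_label)
                / len(covered)) if covered else 0.0
    return coverage, accuracy

data = [({"size": "large"}, "malignant"),
        ({"size": "large"}, "malignant"),
        ({"size": "small"}, "benign"),
        ({"size": "large"}, "benign")]
cov, acc = rule_stats(data, lambda x: x["size"] == "large", "malignant")
```

Measures like these can then be used alongside a Naive Bayes posterior to rank or filter predictions.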

      With the State Grid Corporation of China building a unified strong smart grid, the intelligent power consumption field is developing rapidly, especially the intelligent community, which this paper focuses on as a typical application in that field. In order to make reasonable use of the data produced as intelligent communities expand, the paper presents the concept and architecture of business intelligence as introduced into the intelligent community, establishes a business intelligence system for the intelligent community, and then designs the system's functions and applications. The system fills the application gap of business intelligence in the intelligent community, guides users to use power reasonably, and provides decision support for power grid enterprises.

      Quality control is of crucial importance in many production lines, such as car manufacturing. It should enable the detection and rejection of any defective product and prevent its delivery to the market. In order to prevent high losses due to defective products, some measures must be taken. In this paper a new approach is presented that enables an effective defect root cause analysis (DRCA) in production lines for quality improvement in manufacturing system and process-engineering departments. The method includes the development of a knowledge model based on information obtained from experience, various manufacturing data sources, and defect-related knowledge. Due to the increasing level of complexity in production lines and various uncertainties in system parameters, a hierarchical Bayesian network is used, which identifies the most probable root cause of the defect. The method has been applied to defective vehicle body surfaces in the paint shop of an automotive assembly plant. Both simulation results and practical tests demonstrate the acceptable performance of the proposed method.

      Diagnosis of complex and rare medical cases lacking clear symptoms of particular abnormalities is a very challenging problem. Case-based reasoning (CBR) is known to provide helpful and generic guidelines for practitioners, as well as for the design of medical expert systems dealing with such non-standard diagnostic problems. Single example learning (SEL) algorithms offer a more formal machine learning framework for classification of rare and novel classes. However, both CBR and SEL approaches require a significant number of well-studied examples and a set of objective features capable of providing robust matching of novel examples with previous ones. In practice, data sets for well-defined abnormalities suited for quantification with existing indicators are often limited, while a significant amount of valuable clinical information from cases labeled only as normal or abnormal, without a particular diagnosis, remains underutilized. Recently, we have demonstrated that this information can be effectively employed to produce powerful normal-abnormal meta-classifiers using ensemble learning techniques applied to existing physiological indicators. This is achieved by an optimal weighted combination of complementary indicators which are experts in different regimes of the considered complex biological system. Therefore, partial information on a wide variety of dynamical regimes becomes implicitly encoded in the obtained ensemble of classifiers, while only the aggregated output is used. Extraction of this underutilized knowledge can be formalized in terms of ensemble decomposition learning (EDL) and used for representation of complex and rare cases in terms of intrinsic dynamical regimes. Such a representation could prove to be more accurate and robust compared to traditional CBR and SEL approaches. An illustrative application of the EDL approach to cardiac diagnostics based on HRV analysis is presented and discussed.

      This paper discusses a Multi-Compartment Vehicle Routing Problem (MC-VRP) with Split-Patterns (MC-VRP-SP). This MC-VRP focuses on the delivery of multiple types of fuel oil from tank farms to customer locations, with the main objective of minimizing the total traveling cost. Multiple types of fuel oil are loaded onto vehicles of various capacities, whose compartments also vary in capacity. Two mathematical models are proposed to represent this MC-VRP-SP. The original customer demand is split according to predetermined patterns so that it can be loaded into compartments appropriately; however, the split demands must be delivered by the same truck, and one compartment can hold only a single split demand. An optimization approach with limited computational time is used to solve these mathematical models. The numerical examples show that the optimization approach yields optimal solutions only for small problems; for large problems it can determine feasible solutions within the given computational time. Therefore, a savings algorithm is used to separate the customers into clusters, and each cluster is then solved separately. The results show that this clustering improves the solution for large problems.
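The abstract names a savings algorithm for grouping customers; assuming the classical Clarke-Wright savings measure (the abstract does not specify the variant), the pairing score s(i, j) = d(depot, i) + d(depot, j) − d(i, j) can be sketched as follows, with invented coordinates:

```python
import math

def savings_list(depot, customers):
    """Clarke-Wright savings s(i,j) = d(depot,i) + d(depot,j) - d(i,j),
    sorted descending; a large saving suggests serving i and j together."""
    def d(a, b):
        return math.dist(a, b)
    pairs = []
    for i in range(len(customers)):
        for j in range(i + 1, len(customers)):
            s = d(depot, customers[i]) + d(depot, customers[j]) \
                - d(customers[i], customers[j])
            pairs.append((s, i, j))
    return sorted(pairs, reverse=True)

depot = (0.0, 0.0)
customers = [(0.0, 10.0), (1.0, 10.0), (10.0, 0.0)]
best = savings_list(depot, customers)[0]  # highest-saving pair first
```

Merging route pairs greedily in this order, subject to compartment and capacity constraints, yields the clusters the paper then solves separately.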

      This paper presents a collaborative framework for the distributed multiobjective optimization of combinatorial problems. The proposed framework is completely agnostic to the specific specialized metaheuristic used; thus, it is able to use different hybrid strategies combining two or more metaheuristics in a collaborative fashion. In addition, the designed framework uses a central repository of non-dominated solutions. The solutions are further processed in different nodes (machines) and later returned to the central repository: once a metaheuristic has converged to a new solution, its quality is checked, and if it is non-dominated it is stored in the central repository to be used by other nodes (possibly executing a different metaheuristic) as a new starting point. Finally, we tested the proposed framework using metrics from the specialized literature. Results show a consistent improvement of the Pareto front as the number of nodes is increased.
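The repository logic above hinges on Pareto dominance checks. A minimal sketch of a non-dominated repository update for a two-objective minimization problem (an illustration, not the authors' implementation):

```python
def dominates(a, b):
    """a dominates b (minimization): no worse in every objective,
    strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and \
           any(x < y for x, y in zip(a, b))

def update_repository(repo, candidate):
    """Insert candidate if non-dominated; drop members it dominates."""
    if any(dominates(r, candidate) for r in repo):
        return repo
    return [r for r in repo if not dominates(candidate, r)] + [candidate]

repo = []
for sol in [(3, 4), (2, 5), (1, 6), (2, 4), (5, 5)]:
    repo = update_repository(repo, sol)
```

In the distributed setting, each node would run this update against the shared central repository whenever its metaheuristic converges.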

      Most computer models can fully account for the quantitative aspects of all hydrologic and hydraulic design stages. The aim of this paper is to evaluate the capability of the PONDPACK model to analyse drainage networks and pond capacity for handling stormwater runoff from residential areas. Based on actual and design rainfall data, PONDPACK can generate the peak discharges for each sub-catchment for pre- and post-development conditions. The PONDPACK model can analyse drainage networks using the Rational Method and SCS hydrograph methods, and pond routing can be generated using the Modified Puls Method or the Muskingum Method. Results have shown that PONDPACK performs well with the SCS hydrograph method compared to the Modified Rational Method, as it can generate the peak runoff for each sub-catchment and also produce flow and storage routing for each link and pond system. The Modified Rational Method, by contrast, can only generate peak discharge from a single catchment area; since it cannot generate a hydrograph, it is also not applicable to flow routing. For a 20-year ARI, the detention pond capacity reaches up to 2800 m3. PONDPACK is applicable to the design and analysis of drainage networks and pond systems with some modifications, such as adjusting the CN values to suit Malaysian conditions.

      This paper presents the design, simulation, fabrication, and measured parameters of a linear series-fed microstrip patch array for the ISM frequency band. The proposed array antenna consists of three equilateral triangular microstrip patches. The measured antenna characteristics were compared with each other and with theory, showing good agreement between theoretical and experimental data. The bandwidth obtained with the fabricated antenna is 200 MHz, which is better than the simulated bandwidth of 140 MHz.

      In this paper, adjoint-based error estimation and grid adaptation is presented; the procedure is based on an adjoint formulation in which the estimated error in the functional can be directly related to the local residual errors of both the primal and adjoint solutions. This relationship allows local error contributions to be used as indicators in a grid adaptive method designed to produce a specially tuned grid for accurately estimating the chosen functional. The method involves estimating the error in the coarse-grid functional with respect to its value on an embedded, globally refined fine grid. The adaptive algorithm strives to improve the quality of this error estimate by attempting to reduce and equidistribute the error remaining in the functional after correction. A numerical example of the DLR M6 wing indicates that adjoint-based grid adaptation produces a near-optimal grid and observably improves the accuracy of the integral output function of interest. The approach can estimate the functional outputs of a flow analysis with the same accuracy as uniform refinement; it therefore reduces how much users of CFD need to know about the phenomena they are analyzing, and could ultimately remove the dependence on the skill of the user.

      In this work, a dynamical Sliding Mode Control (SMC) is computed for a wastewater treatment bioprocess taking place inside a Sequencing Batch Reactor (SBR). The controller is obtained using generalized observability canonical forms and SMC stabilization. More precisely, the control law design combines exact linearization ideas with SMC. The control goal is to keep a low level of pollutant concentration. Numerical simulations are included to test the performance of the proposed control strategy.
  • Session 2: Computer and Computational Intelligence


      This article proposes a novel method for unconstrained multivariate optimization, named Stairs, based on optimization in one variable. The proposed method is compared against methods from the specialized literature such as multivariate Newton-Raphson and multivariate Fletcher-Powell. The problem instances were taken from real-life situations. For a fair comparison, metrics such as number of iterations, number of instructions, and processing time were taken into account. Stairs showed a speed improvement relative to the compared methods on problems involving difficult differentiation, because it does not use matrix operations.

      In this work we run Mahout's FP-Growth implementation, a frequent pattern mining algorithm, on Internet Movie Database (IMDb) movie keywords. Mining keyword–keyword relations revealed linguistic relations such as synonymy, hyponymy, and hypernymy, as well as more complex semantics. The study resulted in the discovery of interesting keyword relations.
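Mahout's FP-Growth itself is involved, but the relation it computes — keyword pairs that co-occur in at least `min_support` records — can be illustrated with a brute-force counter (the movie keyword lists below are invented):

```python
from itertools import combinations
from collections import Counter

def frequent_pairs(transactions, min_support):
    """Count co-occurring keyword pairs and keep those meeting
    min_support: a brute-force stand-in for what FP-Growth
    computes efficiently on large data."""
    counts = Counter()
    for t in transactions:
        for pair in combinations(sorted(set(t)), 2):
            counts[pair] += 1
    return {p: c for p, c in counts.items() if c >= min_support}

movies = [["zombie", "horror", "gore"],
          ["horror", "zombie"],
          ["comedy", "romance"],
          ["horror", "gore"]]
pairs = frequent_pairs(movies, min_support=2)
```

Pairs that survive the support threshold are the candidates one would then inspect for synonymy, hyponymy, and other semantic relations.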

      Facial recognition algorithms normally identify faces by extracting landmarks or features from an image of the subject's face; the algorithm may then analyze the relative size and shape of the eyes, nose, cheekbones, and jaw. These features are then used to search for other images with matching features. The proposed system performs the same series of tasks, but its distinguishing feature is its landmark selection method, which is based on somatology, a division of anthropology and one of the oldest, most reliable scientific methods for identifying landmarks on the human body.

      In games and virtual reality systems, a highly dynamic virtual world can sharply reduce the efficiency of the path planning system. Virtual characters repeatedly submit path requests when the map changes, which makes the system call graph search code frequently and slow down. In this paper, we propose a path-sharing method for local path repair which gives each obstacle a property storing a “local-repair” path, so that an obstacle can be treated as a navigator; this reduces the number of graph searches when virtual characters execute local repair operations. In addition, we devise a path managing mechanism to minimize graph searches and redundant computation. Instead of injecting a path into a virtual character for it to follow, we store every path that each virtual character is following at a high level of the system, which makes paths easy to reuse when the same request is received.

      This work proposes a hybrid model combining unsupervised and semi-supervised learning models, which aims at prioritizing the emails of a particular user. The priorities are assigned taking into account the user's social interactions and the priorities the user has assigned to emails. The model classifies the user's incoming emails into five levels of priority: VERY HIGH, HIGH, MEDIUM, LOW, and VERY LOW. The proposed system provides a web interface which allows the user to register and log into the system; the logged-in user can then view his or her prioritized emails and social interactions. The work presented here is novel in that it allows the user to change the priorities of emails dynamically.

      Advancements in clinical, portable, and wearable equipment for real-time collection of physiological data provide new opportunities for computerized diagnostics of developed pathologies, early detection of emerging abnormalities, and prediction of acute and critical events. However, many conceptual and algorithmic challenges for robust quantitative modeling in such applications remain unresolved. Variability analysis of physiological time series provides a generic framework for quantification of normal and abnormal states and their discrimination. Unfortunately, in many clinically significant cases it is hard to achieve robust “normal-abnormal” classification using this framework or other established diagnostic modalities. Recently, we have demonstrated that many problems in heart rate variability (HRV) analysis can be overcome when several complementary nonlinear dynamics (NLD) indicators (complexity measures) are combined using boosting-like algorithms. Such generic meta-indicators are capable of detecting both single abnormalities, irrespective of their specific type, and conditions specified by complex combinations of different pathologies. Here we argue that the aggregated probability-like output of these multi-component models could be effective for more detailed quantification of psycho-physiological states. These robust state representations could be used as early signals of emerging abnormalities and other negative physiological changes, as well as for real-time prediction of acute and critical events. In addition, we propose some extensions to HRV analysis and possible ways of applying it to other physiological channels.

      The recruit exam for school transfer at private medical colleges is held by a consortium every summer in Taiwan. The Private Medical College School Transfer Consortium Admission pursues student placement according to students' scores, students' preferences, and departments' subject weights, and it must be done on a fair basis. This study develops an algorithm achieving fair placement. Using database techniques, we fulfilled the requirements of student placement for the medical college transfer exam in 2005. The algorithm is also suited to other resource placement problems.
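The abstract describes placement by scores and preferences under department quotas; one common fair scheme consistent with that description (not necessarily the authors' exact algorithm, and the names and quotas below are invented) is serial placement in score order:

```python
def place_students(students, capacities):
    """Serial placement: rank students by score (descending), then give
    each student their highest-ranked department with a seat left."""
    seats = dict(capacities)
    assignment = {}
    for name, score, prefs in sorted(students, key=lambda s: -s[1]):
        for dept in prefs:
            if seats.get(dept, 0) > 0:
                seats[dept] -= 1
                assignment[name] = dept
                break
        else:
            assignment[name] = None  # no preferred seat remains
    return assignment

students = [("ann", 92, ["medicine", "pharmacy"]),
            ("bob", 88, ["medicine", "nursing"]),
            ("eve", 95, ["medicine", "pharmacy"])]
placed = place_students(students, {"medicine": 1, "pharmacy": 1, "nursing": 1})
```

A weighted-score variant (applying each department's subject weights before sorting) would follow the same structure.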

      Condition monitoring and fault diagnosis of mechanical equipment guarantee normal operation and reliable work, and are an important topic of universal attention. This article explores a strategy combining theory and experimental research, building a set of theories and methods based on motion modality and information entropy for fault diagnosis of automatic mechanisms. The purpose is to solve the following problems: in practical applications, the weak fault signals within short-time, transient vibration response signals are easily drowned out; effective and sensitive characteristic parameters are difficult to extract; and accurate fault localization and real-time diagnosis are difficult to realize. The paper uses motion modality and information entropy to put forward new ideas and methods for analyzing and processing short-time, transient vibration signals and for feature extraction, and applies them to artillery automatic mechanisms, which expands the research scope of the mechanical fault diagnosis discipline.

      A dual band-notched printed planar antenna is proposed for ultrawideband applications. The proposed compact antenna comprises a rectangular radiating patch and a modified partial ground plane. A U-shaped parasitic strip and a rectangular split ring resonator are placed below the substrate to achieve dual band-notched characteristics. The measured results show that the proposed antenna achieves a wide impedance bandwidth from 2.9 to 11 GHz (defined by 10 dB), with one notched frequency band at 3.3–3.8 GHz and the other at 5.6–6 GHz. A stable radiation pattern and flat gain, except in the notched bands, make the proposed antenna suitable for UWB applications.

      In this paper we propose an improvement to the K-means clustering heuristic. The improvement has been tested on breast cancer databases. Today, clustering problems are everywhere; applications appear in data mining, machine learning, knowledge discovery, data compression, and pattern recognition, among others. One of the most popular clustering methods is K-means; this algorithm has been studied extensively and several improvements have been made, many of them based on the definition of the initial parameters. In contrast, this paper proposes a new function to calculate the distance; the improvement comes from experimental analysis of the classical algorithm. Experimentally, the improved algorithm produced better-quality solutions when applied to population databases of breast cancer. Finally, we believe that this improvement may be useful in many types of applications; it can serve as a support tool for research on breast cancer and for decision making in the allocation of resources for prevention and treatment.
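The paper's new distance function is not given in the abstract, but the place it plugs into can be sketched: K-means with an interchangeable distance function (squared Euclidean below as a stand-in, with deliberately simplified deterministic seeding):

```python
def kmeans(points, k, distance, iters=20):
    """Plain K-means with a pluggable distance function; swapping in a
    different `distance` is the kind of modification the paper studies."""
    centroids = points[:k]  # simple deterministic seeding for the sketch
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: distance(p, centroids[c]))
            clusters[i].append(p)
        centroids = [
            tuple(sum(x) / len(cl) for x in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids, clusters

euclid_sq = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
pts = [(0.0, 0.0), (0.1, 0.2), (5.0, 5.0), (5.2, 4.8)]
centroids, clusters = kmeans(pts, 2, euclid_sq)
```

Any alternative distance with the same two-point signature can be passed in without touching the clustering loop.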
  • Session 3: Computer and Computational Intelligence


      Drill wear is an important issue in manufacturing industries; it not only affects the surface roughness of the hole but also influences drill life. Therefore, replacement of a drill at an appropriate time is of significant importance. Flank wear in a drill depends upon input parameters like speed, feed rate, drill diameter, thrust force, torque, and chip thickness. It sometimes becomes difficult to measure all these parameters quantitatively, and a qualitative description becomes easier; in those cases, a fuzzy neural network based prediction model becomes more useful in tool condition monitoring (TCM). This paper describes the application of such a fuzzy neural network model to the prediction of drill wear. Here chip thickness is expressed as a fuzzy linguistic variable used as an input to the artificial neural network (ANN). Fuzzy Back Propagation Neural Network (FBPNN) and Fuzzy Self-Organizing Feature Map (FSOFM) networks have been tried for developing a drill wear prediction system for drilling a mild steel workpiece. Finally, the comparative performance of the different TCM strategies and ANN architectures tried in the present work for the development of a robust drill wear prediction system is presented.

      In this work, denoising of multichannel electrocardiogram (MECG) signals by multiscale principal component analysis (MSPCA) is proposed. The desired quality of the processed signals is achieved by selecting principal components (PCs) based on energy features in selected wavelet subband matrices. The number of PCs selected to achieve the target quality of the denoised signals is based on the cumulative percentage of the total variation of the variances. The choice of multiscale matrices and the selection of eigenvalues preserve the target energy in the processed signals. Input and output signal-to-noise ratio (SNR) is measured for quantitative performance. Signal distortion is evaluated using the percentage root mean square difference (PRD) and the wavelet energy based diagnostic distortion measure (WEDD). An SNR improvement of 31.12 dB with good denoising effect has been found using the CSE multilead measurement library database.
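Selecting the number of principal components by cumulative percentage of total variance, as described above, reduces to a simple threshold rule on the ordered eigenvalues; the variances and target below are illustrative, not values from the paper:

```python
def select_components(variances, target=0.95):
    """Smallest number of leading principal components whose cumulative
    share of total variance reaches the target fraction."""
    ordered = sorted(variances, reverse=True)
    total = sum(ordered)
    running = 0.0
    for k, v in enumerate(ordered, start=1):
        running += v
        if running / total >= target:
            return k
    return len(ordered)

# Hypothetical subband eigenvalues; keep enough PCs for 90% of variance.
k = select_components([5.0, 2.5, 1.5, 0.7, 0.3], target=0.90)
```

In MSPCA this rule would be applied per wavelet subband matrix before reconstructing the denoised signal from the retained components.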

      The efficiency of information application affects the operational efficiency of the entire logistics supply chain. This paper focuses on the application of information technology in the supply chain management process. A market-oriented supply chain model was analyzed, and key information technologies and an information technology system for supply chain management were researched. The integrated application of information technology in the supply chain process will improve enterprise efficiency.

      In this paper, we introduce Augmented Reality (AR) technology and its unique standards, which can be applied through multiple channels depending on the needs of developers. The research was carried out through the development of a system named ‘Augmented Reality Band’ to present a new form of AR application: a virtual music world experienced through clothing, with control over the clothes. The system was developed using the FLAR Toolkit and can be used by users through their websites. A user can run the system online from anywhere simply by wearing a printed shirt with an AR code embedded in musical instruments, and can then interact with the system. In addition, the authors assessed the quality of the system with technical professionals; the quality level was found to be very high (4.64/5.00), and user satisfaction, evaluated by a sample group, was at a satisfactory level (4.22/5.00). Discussions and suggestions for further work are presented at the end.

      Classification is one of the most important techniques in data mining. Various applications apply classification techniques to extract knowledge from the huge amounts of data collected from diverse sources; gathering data from medical sources is one of the most popular. Among such datasets, cancer datasets are a real eye-catcher for researchers, and classifying them is a real challenge because of their high dimensionality and enormous size. Several existing classifiers are handy for classifying these high-dimensional datasets. Decision tree classifiers are good candidates for this task, and we have used a boosting algorithm (AdaBoost) together with decision tree classifiers. In this paper, the classification accuracy and the time to build the model are analyzed for decision tree induction classifiers and AdaBoost.
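AdaBoost over tree-based weak learners can be sketched with one-dimensional decision stumps standing in for the decision tree classifiers; the toy data below is invented, not the paper's cancer datasets:

```python
import math

def stump_predict(threshold, polarity, x):
    # A stump is a depth-1 tree: sign flips at the threshold.
    return polarity if x >= threshold else -polarity

def adaboost(xs, ys, rounds=5):
    """AdaBoost: reweight examples after each round so the next weak
    learner focuses on the points misclassified so far."""
    n = len(xs)
    w = [1.0 / n] * n
    model = []  # list of (alpha, threshold, polarity)
    for _ in range(rounds):
        best = None
        for t in xs:                      # candidate thresholds at data points
            for pol in (1, -1):
                err = sum(wi for wi, x, y in zip(w, xs, ys)
                          if stump_predict(t, pol, x) != y)
                if best is None or err < best[0]:
                    best = (err, t, pol)
        err, t, pol = best
        err = min(max(err, 1e-10), 1 - 1e-10)     # avoid log(0)
        alpha = 0.5 * math.log((1 - err) / err)   # stump weight
        model.append((alpha, t, pol))
        w = [wi * math.exp(-alpha * y * stump_predict(t, pol, x))
             for wi, x, y in zip(w, xs, ys)]
        s = sum(w)
        w = [wi / s for wi in w]
    return model

def predict(model, x):
    score = sum(a * stump_predict(t, p, x) for a, t, p in model)
    return 1 if score >= 0 else -1

xs = [1.0, 2.0, 3.0, 6.0, 7.0, 8.0]
ys = [-1, -1, -1, 1, 1, 1]
model = adaboost(xs, ys, rounds=3)
acc = sum(predict(model, x) == y for x, y in zip(xs, ys)) / len(xs)
```

The paper's setting replaces the stump search with full decision tree induction, but the weighting and voting scheme is the same.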

      Agents have now replaced the role of buyers and sellers in online auctions. These agents are equipped with various bidding strategies to maximize their chances of obtaining the item successfully with some profit. In this work, we consider the market economy of utilizing bidder agents and compare the performance between agents of the same type and agents of different types in a simulated marketplace, in terms of the winner's utility and the number of winning auctions. Based on the results, having bidder agents with different bidding strategies allows human buyers to select the best bidding agents to bid on their behalf and promotes healthy competition among the bidders in any online auction.

      Data structures are very important to computer science. Since DNA computing appeared, many researchers have used this computing paradigm to solve problems, especially NP-hard problems, because of its advantages of parallelism and mass storage. To be used in practice, a DNA computer needs to organize data effectively, as an electronic computer does. The stack is a data structure widely used in practice. In this paper, a method for stack data storage based on hairpin structures in a DNA computer is described, and the corresponding data operation method is demonstrated. The idea is feasible for solving the data organization problem of DNA computers and can help DNA computing develop toward practical application.
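The stack semantics the hairpin encoding must reproduce are the usual LIFO push/pop operations; for reference, an electronic-computer version (the DNA strand strings are placeholders, not an encoding scheme from the paper):

```python
class Stack:
    """LIFO stack: push and pop happen at the same end, which is the
    behaviour the paper encodes with DNA hairpin structures."""
    def __init__(self):
        self._items = []

    def push(self, item):
        self._items.append(item)

    def pop(self):
        if not self._items:
            raise IndexError("pop from empty stack")
        return self._items.pop()

    def peek(self):
        return self._items[-1]

    def __len__(self):
        return len(self._items)

s = Stack()
for strand in ["ATCG", "GGTA", "TTAC"]:
    s.push(strand)
top = s.pop()  # last strand pushed comes off first
```

Any molecular realization must preserve exactly this ordering guarantee: the most recently stored element is the only one accessible.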

      We analyze the principles of the ID3 algorithm, which creates rules based on the concepts of entropy and gain over a prepared data set. On the other hand, the naïve Bayes classifier allows us to classify the prepared data set using probabilistic evidence. We propose a hybrid scheme based on the ID3 algorithm and the naïve Bayes classifier that improves accuracy in classification tasks. We believe this may be useful in many types of applications, so the scheme can serve as a support tool for research and as a way to make decisions. Finally, experiments show that the hybrid scheme increases accuracy when applied to population databases of breast cancer.
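The entropy and gain concepts ID3 builds on can be made concrete; a small sketch with an invented single-attribute data set (a perfect split yields a gain equal to the full label entropy):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label list, in bits."""
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

def information_gain(records, attribute, labels):
    """ID3 gain: label entropy minus the weighted entropy after
    splitting on one attribute."""
    total = len(records)
    by_value = {}
    for rec, y in zip(records, labels):
        by_value.setdefault(rec[attribute], []).append(y)
    remainder = sum(len(ys) / total * entropy(ys)
                    for ys in by_value.values())
    return entropy(labels) - remainder

records = [{"size": "large"}, {"size": "large"},
           {"size": "small"}, {"size": "small"}]
labels = ["malignant", "malignant", "benign", "benign"]
gain = information_gain(records, "size", labels)
```

ID3 picks the attribute with the highest such gain at each node; a hybrid scheme could, for example, hand the leaves over to a naïve Bayes classifier.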

      Due to the increasing level of complexity and the huge number of operational aspects in manufacturing systems, quality control is of crucial importance. In order to prevent production losses due to defective products, some measures must be taken. In this paper a probabilistic approach based on a large scale Bayesian network (LSBN) is presented that enables an effective defect root cause analysis (DRCA) for quality improvement in the products of large scale manufacturing systems. The proposed approach is capable of finding the most probable root cause of a defective product (several defects detected simultaneously). It is based on Bayesian inference for reasoning under uncertainty across a wide spectrum of system sizes. It is model based and accumulates system knowledge within the problem domain, with data gathered from experience, various manufacturing data sources, and defect-related knowledge. System learning can be supervised by user feedback on the actual root cause. The general DRCA is applied to defective vehicle body surfaces in the paint shop of an automotive assembly plant.

      Notebook computers currently play an important role in human life because of their capability, portability, and mobility. To buy a notebook computer, one should look for a product that packs the best features together at an affordable price; however, the highly competitive notebook computer business makes this difficult for buyers to determine. The objective of this paper is to apply the fuzzy AHP to determine the relative importance of the decision criteria in order to select the best notebook computer. The fuzzy AHP can efficiently handle the fuzziness of the data involved in the multi-criteria decision making problem of this study.
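Full fuzzy AHP works on triangular fuzzy judgments; as a simplified sketch of the underlying AHP step, crisp pairwise comparisons can be turned into criterion weights with the geometric-mean approximation (the judgment matrix below is invented):

```python
from math import prod

def ahp_weights(matrix):
    """Priority weights from a pairwise comparison matrix via the
    geometric-mean (crisp AHP) approximation; fuzzy AHP would replace
    each judgment with a triangular fuzzy number."""
    n = len(matrix)
    gm = [prod(row) ** (1.0 / n) for row in matrix]   # row geometric means
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical judgments for criteria (price, battery life, weight):
# price is 3x as important as battery, 5x as important as weight, etc.
matrix = [[1.0, 3.0, 5.0],
          [1.0 / 3.0, 1.0, 3.0],
          [1.0 / 5.0, 1.0 / 3.0, 1.0]]
weights = ahp_weights(matrix)
```

The resulting weights then score each notebook alternative; the fuzzy version carries (low, mid, high) triples through the same computation before defuzzifying.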
  • Session 4: Computer and Computational Intelligence


      Today, plastic manufacturing processes have changed significantly, and serious efforts are being made to achieve complex manufacturing processes that deliver good quality while remaining economical. Injection molding is one of the most commonly used methods of shaping plastic, and it is particularly widespread in the plastic products industry. Nonetheless, to reduce the heavy costs and unacceptably long delays during mold construction, the molding industry tries wherever possible to use methods that reduce these costs, ultimately lowering the overall cost, while relying on scientific methods to achieve the quality that attracts customers.

      In this paper we present a novel metaheuristic based on a Deterministic Finite Automaton (DFA) for the multi-objective optimization of combinatorial problems. First, we propose a new DFA based on swapping (DFAS); DFAS allows the representation of the feasible solution space of combinatorial problems. We then define an algorithm that works with DFAS, named the Exchange Deterministic Algorithm (EDA). EDA has three steps: the first creates the initial solutions, the second improves them, and the last uses transitions between the states of the DFAS to improve the solutions further. EDA was tested using well-known instances of the bi-objective Traveling Salesman Problem (TSP), and its results were compared against exhaustive techniques from the specialized literature using multi-objective metrics. The results show that EDA solutions are close to the optimal solutions.

      The Compact Linear Collider (CLIC) is a next generation electron-positron linear collider designed for high precision measurements of physics beyond the Standard Model, envisaged for the post Large Hadron Collider (LHC) era. One of the main challenges in achieving high precision measurements is the generation of stable electron-positron bunches without the unwanted satellite trails commonly encountered with current thermionic gun technologies. A possible alternative technology currently being studied is a multistage (IR, GR, UV) 1.499 GHz class 4 mode-locked laser photoinjector system for the generation of clean bunches. The CLIC Test Facility 3 (CTF3) at CERN is a test facility for the research and design of key components of the CLIC collider. This paper documents characterization measurements made on the CTF3 laser and proposes a new approach to pulsed laser amplitude stabilization using Kalman estimated measurements and FPGAs for data fusion.
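Kalman-estimated amplitude measurements for a slowly varying laser amplitude can be sketched with a scalar constant-state filter; the noise variances and simulated signal below are illustrative, not CTF3 values:

```python
import random

def kalman_1d(measurements, q=1e-4, r=0.04, x0=0.0, p0=1.0):
    """Scalar Kalman filter: constant-state model x_k = x_{k-1} plus
    process noise (variance q), readings corrupted by noise of
    variance r."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p += q                # predict: state unchanged, uncertainty grows
        k = p / (p + r)       # Kalman gain
        x += k * (z - x)      # update with the innovation
        p *= (1 - k)          # shrink uncertainty
        estimates.append(x)
    return estimates

random.seed(0)
true_amp = 1.0
readings = [true_amp + random.gauss(0, 0.2) for _ in range(200)]
est = kalman_1d(readings)
```

An FPGA implementation would run the same predict/update pair per pulse, feeding the estimate to the amplitude actuator.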

      This paper presents a system based on the eigenface facial expression recognition algorithm that detects faces and recognizes facial expressions in real time; it recognizes the person by comparing the characteristics of the expression with a known expression. The system also uses motion tracking to show the orientation of the user's face with the aid of visual controls. The eigenface detection algorithm was chosen for this system because of its speed in detecting faces in compact environments such as homes and offices. We present a system that provides improved facial expression recognition and is highly tolerant of varying lighting conditions while delivering consistent results.

      A method based on the combined use of sparse representation and data-fidelity constraints is proposed to remove noise from measured spectrum signals. Denoising performance is evaluated with three indicators: signal-to-noise ratio (SNR), root-mean-square error (RMSE), and the normalized correlation coefficient (NCC). Simulation results show that, in comparison with wavelet denoising methods, the proposed method has better denoising ability and scores better on these evaluation metrics.
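The three evaluation indicators named above are standard; the abstract does not give the authors' exact definitions, so the following sketch uses the common textbook forms, with an illustrative sine signal standing in for a measured spectrum:

```python
import numpy as np

def snr_db(clean, estimate):
    # Signal-to-noise ratio in dB: signal power over residual-error power.
    noise = clean - estimate
    return 10 * np.log10(np.sum(clean**2) / np.sum(noise**2))

def rmse(clean, estimate):
    # Root-mean-square error between the clean and denoised signals.
    return np.sqrt(np.mean((clean - estimate)**2))

def ncc(clean, estimate):
    # Normalized correlation coefficient; 1.0 means a perfect match in shape.
    return np.sum(clean * estimate) / np.sqrt(np.sum(clean**2) * np.sum(estimate**2))

t = np.linspace(0, 1, 500)
signal = np.sin(2 * np.pi * 5 * t)
noisy = signal + 0.1 * np.random.default_rng(0).standard_normal(500)
print(snr_db(signal, noisy), rmse(signal, noisy), ncc(signal, noisy))
```

Higher SNR and NCC, and lower RMSE, indicate better denoising.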

      With the popularity of web applications, falsified web pages and the spreading of trojans and viruses via web vulnerabilities are becoming increasingly common. To address this problem, we apply script text feature analysis to webpages and discuss the state of security-related analysis. First, based on the current analysis of trojan web links and its existing problems, we propose a model for analyzing webpage links within the page's source code and describe a security-related algorithm for web links. We then describe the algorithm in mathematical language. Finally, we analyze and summarize the experimental results, verifying the reliability and soundness of the algorithm.

      This paper proposes a novel method for unconstrained multivariate optimization which, compared against methods from the specialized literature such as Newton-Raphson and Fletcher-Powell, improves processing time. The proposal covers the largest possible part of the function's solution set, evaluating points generated with simple operations in order to reduce the time required. Finally, the novel method is demonstrated on an application problem.

      This paper states a novel method based on hill climbing for unconstrained multivariate optimization. The proposed method was compared against methods from the specialized literature such as multivariate Newton-Raphson and multivariate Fletcher-Powell. For a fair comparison, metrics such as number of iterations, processing time, and stability of the solution were taken into account. The results showed that the proposed method performed best, in some cases scoring 100 out of 100 on these metrics.
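The paper's exact hill-climbing variant is not specified; as a reference point, here is a minimal derivative-free sketch for unconstrained multivariate minimization. The random-perturbation step, the patience-based step shrinking, and the sphere test function are all illustrative assumptions, not the authors' method:

```python
import random

def hill_climb(f, x0, step=1.0, shrink=0.5, patience=30, tol=1e-9, max_iter=100000):
    """Minimize f by probing random perturbations of the current point,
    shrinking the step size after `patience` consecutive failures."""
    x, fx = list(x0), f(x0)
    rng = random.Random(42)
    fails = 0
    for _ in range(max_iter):
        if step < tol:
            break
        cand = [xi + rng.uniform(-step, step) for xi in x]
        fc = f(cand)
        if fc < fx:
            x, fx, fails = cand, fc, 0   # accept the improvement
        else:
            fails += 1
            if fails >= patience:        # stuck at this scale: search more locally
                step, fails = step * shrink, 0
    return x, fx

# Example: minimize a shifted sphere function, optimum at (1, -2).
f = lambda v: (v[0] - 1)**2 + (v[1] + 2)**2
x, fx = hill_climb(f, [5.0, 5.0])
print(x, fx)
```

Unlike Newton-Raphson or Fletcher-Powell, this needs no derivatives, which is what makes hill climbing attractive when gradients are unavailable.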

      The simplest mathematical model in epidemiology, the SIR model, ignores population demography such as births, deaths, and migration. In reality, people interact across subpopulations, and carriers spread disease from one person to another. This research aims to simulate population migration in Thailand. We matched emigration and immigration statistics by distributing the population of each initial district to every district by percentage. In this initial stage of the project, we simulate the infection dynamics of a metapopulation model using real migration data from Thailand and conclude that population movement, together with births and deaths, affects the number of infections.
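As background, the SIR dynamics the study extends can be sketched with a toy two-patch metapopulation model with symmetric migration. The rates and populations below are illustrative, not Thailand's actual statistics, and births/deaths are omitted for brevity:

```python
import numpy as np

def step(S, I, R, beta=0.3, gamma=0.1, m=0.05, dt=1.0):
    """One Euler step of a two-patch SIR model; m is the migration fraction
    exchanged between the two patches each step."""
    N = S + I + R
    newS = S + dt * (-beta * S * I / N)
    newI = I + dt * (beta * S * I / N - gamma * I)
    newR = R + dt * (gamma * I)
    # Migration: a fraction m of each compartment swaps between the patches.
    for X in (newS, newI, newR):
        moved = m * X
        X += moved[::-1] - moved
    return newS, newI, newR

S = np.array([9990.0, 10000.0])
I = np.array([10.0, 0.0])        # outbreak starts only in patch 0
R = np.array([0.0, 0.0])
for _ in range(200):
    S, I, R = step(S, I, R)
print(R)   # recovered counts show the epidemic reached both patches
```

Without the migration term (m = 0), patch 1 would never see a single infection, which is the qualitative effect of population movement the abstract describes.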

      The reliability of the on-board computer system is vital to satellite operation, and it is necessary to perform quantitative reliability analysis for the on-board computer in the early design stage. However, traditional analysis methods are inefficient for complex systems with repairable components, so in this paper we propose a new approach based on the Repair Fault Tree to evaluate system reliability. Moreover, we take a divide-and-conquer strategy to analyze the reliability of the Repair Fault Tree. The analysis results show that our approach can not only accurately analyze a repairable system but also reduce modeling complexity.
  • Session 5: Computer and Computational Intelligence


      This research is mainly about detecting and preventing plagiarism among students. It provides the capability to detect similarities between documents submitted by students. The main focus of this research is to study how to detect textual plagiarism, the most noticeable and serious form of plagiarism, which can be categorized as direct stealing without proper acknowledgment and the consent of the work's author. The Rabin-Karp algorithm, a string-searching algorithm that uses hashing to compare strings, is used to detect similarities among the files. As a result, the developed application is able to compare documents in order to find the similarities among them. The percentage of similarity is then generated in a text format for reference.
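Rabin-Karp compares a rolling hash of every fixed-length window instead of the raw strings. A minimal sketch of using those hashes to estimate document similarity follows; the window size k, the base, and the modulus are arbitrary choices, and the Jaccard score over hash sets is one common way (not necessarily the paper's) to turn matches into a percentage:

```python
def kgram_hashes(text, k=5, base=256, mod=1_000_000_007):
    """Rolling (Rabin-Karp) hashes of every k-character window in text."""
    if len(text) < k:
        return set()
    h = 0
    for ch in text[:k]:
        h = (h * base + ord(ch)) % mod
    hashes = {h}
    top = pow(base, k - 1, mod)
    for i in range(k, len(text)):
        h = (h - ord(text[i - k]) * top) % mod   # drop the leftmost character
        h = (h * base + ord(text[i])) % mod      # append the new character
        hashes.add(h)
    return hashes

def similarity(a, b, k=5):
    """Fraction of shared k-gram hashes (Jaccard index) between two documents."""
    ha, hb = kgram_hashes(a, k), kgram_hashes(b, k)
    if not ha or not hb:
        return 0.0
    return len(ha & hb) / len(ha | hb)

print(similarity("the quick brown fox jumps", "the quick brown dog sleeps"))
```

Each window's hash is updated in O(1) from the previous one, so hashing a document is linear in its length rather than quadratic.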

      In this paper, we present a method based on a Gaussian vessel detector and a line operator to detect the center of the optic disc (OD). The Gaussian vessel detector is used to remove all the vessels, so that any influence the vessels might have during the line operation is prevented. Next, since the OD usually has a circular brightness structure, with brightness varying from relatively dark to bright along the direction from the boundary to the center of the OD, a line operator is designed to search for such a structure at each pixel in the fundus image, with the direction of each line segment determined by the maximum response. After several denoising steps, the coordinates of the points with the brightest intensity are averaged to give the detection result. The proposed method achieved a success ratio of 96.3% on the STARE fundus database and was also tested on several images that suffer from severely bad illumination or noise.

      Quality inspection is an important aspect of modern industrial manufacturing. In the automotive industry, automatic paint inspection is of crucial importance for maintaining paint quality. In most automotive plants worldwide, the inspection process is still mainly performed by human vision and is therefore inefficient and costly. Automatic paint defect inspection is thus required to reduce the cost and time wasted because of defects. In this paper a new approach is proposed for detecting defects on a painted car body from serial paint images and subsequently classifying the localized defect types. Defects are first detected and localized using a rotation-invariant measure of the local variance (VAR) operator, and then classified using a learning vector quantization (LVQ) neural network.

      We describe a strategy for optimizing bi-objective combinatorial problems. Initially, we design and implement a metaheuristic of complexity O(n^4) for optimizing combinatorial problems, named MIDRS (Metaheurística de Intercambio Determinista sobre Autómatas con Recocido Simulado, a deterministic-interchange metaheuristic on automata with simulated annealing). MIDRS bases its strategy on the theory of multi-objective deterministic finite automata. Subsequently, we analyze the behavior of the technique while varying the weight ratio between the objective functions. Finally, MIDRS is contrasted with high-impact global metaheuristics such as ant colony algorithms, evolutionary techniques, and local search strategies.

      This paper proposes a method to control a thermoelectric device using a customized proportional controller. The system also applies a heat disturbance observer to estimate the external heat disturbance acting on the object. On the equipment side, a Peltier device is selected as the thermoelectric module; this type of device is well suited to thermal-system analysis because it responds quickly and is stable compared with other devices of its kind. The proposal also uses a high-speed controller (an FPGA) to implement the control system, and the current driver circuit and temperature sensor are digital, which protects well against noise and is easy to program with the FPGA. The results of this research can serve as a guideline for analyzing and controlling thermal systems, or for temperature control in various applications.

      Knowledge of human tongue color and its distribution is essential for image analysis in computerized tongue diagnosis systems. Based on a large and comprehensive tongue image database containing over 5200 high-quality tongue images, this study aims to provide the first statistically described human tongue color distribution in the CIE xyY color space and its application to color feature extraction. Representative color pixels are first extracted from each tongue image, using the proposed block-image boundary searching algorithm to reduce the total number of color pixels, and are then combined to generate the tongue color distribution. After that, a one-class support vector machine (SVM) is employed to mathematically describe the boundary of this color distribution and thus separate tongue from non-tongue colors. Finally, we show the application of the proposed color distribution to color feature extraction. We believe this work is the most comprehensive and detailed study of tongue color distribution to date.

      This study presents the development of an instrumented walker focused on the forces transferred through the resultant loads of the subject's hand and the upper-extremity kinematics. Based on theoretical and experimental studies, a novel walking-aid system is proposed that is capable of reducing the stress experienced by the subject without changing the usage pattern of the walker. Bilateral upper-extremity kinematic and kinetic data were acquired through Qualisys motion capture together with the Visual3D motion analysis system. Experiments were carried out with seven healthy subjects: right-handed young adults and elderly. Using the experimental data, a walker support system was developed, and the design was verified with the same subjects through repeated experimentation. Substantial changes in the resultant forces were observed when the walker support system was used.

      In a video sequence, object tracking becomes difficult if the object is partially occluded or the background color is the same as the object's. Here, under these two constraints, the object is tracked successfully by partitioning the object and then considering the confidence of each region. The object is partitioned so that its surroundings are also taken into consideration. By exploiting the attention shifting among local regions that the human vision system performs during tracking, a robust system is designed. By matching against and comparing with the highest-confidence region, a weight is assigned to each partitioned region of the object. The algorithm is tested under the constraints of partial occlusion and similar background.

      The Ant Colony method is one of the metaheuristics most used in the analysis of the Traveling Salesman Problem (TSP). Our objective in this research is to take one of the instances proposed by the research group of the University of Heidelberg in Germany and apply this method to obtain the best solution that has been found so far. First, it is important to define concepts such as the TSP and metaheuristics; we then analyze how the Ant Colony method works, the characteristics of the instance we chose, and finally the pseudocode. All this was done by consulting and investigating papers, articles, and earlier research by scientists, mathematicians, and even students like us.
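The referenced pseudocode is not reproduced in the abstract; as an orientation, here is a minimal Ant System sketch for the symmetric TSP. All parameter values are conventional defaults rather than the study's settings, and the eight-city circle instance is a stand-in for the Heidelberg (TSPLIB) instance they used:

```python
import math, random

def ant_colony_tsp(coords, n_ants=20, n_iters=100, alpha=1.0, beta=3.0,
                   rho=0.5, q=1.0, seed=0):
    """Minimal Ant System for the symmetric TSP on 2-D coordinates."""
    rng = random.Random(seed)
    n = len(coords)
    dist = [[math.dist(a, b) for b in coords] for a in coords]
    tau = [[1.0] * n for _ in range(n)]          # pheromone trails
    best_tour, best_len = None, float("inf")
    for _ in range(n_iters):
        tours = []
        for _ in range(n_ants):
            tour = [rng.randrange(n)]
            unvisited = set(range(n)) - {tour[0]}
            while unvisited:
                i = tour[-1]
                items = list(unvisited)
                # Probability ~ pheromone^alpha * (1/distance)^beta.
                weights = [tau[i][j] ** alpha * (1.0 / dist[i][j]) ** beta
                           for j in items]
                j = rng.choices(items, weights)[0]
                tour.append(j)
                unvisited.remove(j)
            length = sum(dist[tour[k]][tour[(k + 1) % n]] for k in range(n))
            tours.append((tour, length))
            if length < best_len:
                best_tour, best_len = tour, length
        # Evaporate, then deposit pheromone proportional to tour quality.
        for i in range(n):
            for j in range(n):
                tau[i][j] *= (1 - rho)
        for tour, length in tours:
            for k in range(n):
                a, b = tour[k], tour[(k + 1) % n]
                tau[a][b] += q / length
                tau[b][a] += q / length
    return best_tour, best_len

# Eight points on a circle: the optimal tour visits them in angular order.
pts = [(math.cos(2 * math.pi * k / 8), math.sin(2 * math.pi * k / 8))
       for k in range(8)]
tour, length = ant_colony_tsp(pts)
print(tour, length)
```

The evaporation step is what keeps early, mediocre tours from dominating: pheromone on edges that stop being chosen decays geometrically.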

      This work investigates how opera singers manipulate timing in order to produce expressive performances that have common features but also bear a distinguishable personal style. We characterize performances not only relative to the score, but also consider the contribution of features extracted from the libretto. Our approach is based on applying machine learning to extract singer-specific patterns of expressive singing from performances by Josep Carreras and Placido Domingo. We compare and contrast some of these rules, and we draw analogies between them and some of the general expressive performance rules from existing literature.
  • Session 6: Computer and Computational Intelligence


      This study conducts tests to measure the quality of the e-government websites of five Asian countries via online web diagnostic tools. We propose a methodology for determining and evaluating the best e-government website based on many website-quality criteria. The model is implemented using a combination of Grey Analysis (GA) and the Analytic Hierarchy Process (AHP) to generate weights for the criteria that reflect preferences better and more fairly. The results of this study confirm that applying the combined GA and AHP approach significantly accelerates implementation, raises overall effectiveness, and enables a more efficient procedure.
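The AHP half of the model derives criterion weights from a pairwise-comparison matrix via its principal eigenvector. A minimal sketch using power iteration follows; the comparison values are invented for illustration and are not the study's judgments:

```python
import numpy as np

def ahp_weights(pairwise, iters=100):
    """Principal-eigenvector weights of an AHP pairwise-comparison matrix,
    computed by power iteration and normalized to sum to 1."""
    A = np.asarray(pairwise, dtype=float)
    w = np.ones(A.shape[0])
    for _ in range(iters):
        w = A @ w
        w = w / w.sum()
    return w

# Hypothetical judgments: criterion 0 is 3x as important as 1, 5x as important as 2.
A = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 2.0],
     [1/5, 1/2, 1.0]]
w = ahp_weights(A)
print(w)   # weights ordered criterion 0 > 1 > 2
```

Because the matrix is positive, power iteration converges to the Perron eigenvector, which is exactly the weight vector AHP prescribes.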

      In this paper, the authors use the relevant feed-consumption and corresponding weight data released by Arbor Acres (AA) and regression methods to obtain the production function (feed consumption versus weight) of Arbor Acres broilers. They then apply the principle of margin balance to obtain, for each combination of feed and broiler prices, the sale weight at which the largest profit is made, and formulate a lookup table for the optimal sale weight. This table provides relevant producers with a scientific, simple, and rapid tool for finding the optimal sale weight at which profit is maximized.

      In this article, we propose a methodology to provide an extra level of protection for mobile phone content against Bluetooth and IP viruses. Nowadays, such viruses can spread quickly through various means, including Bluetooth and traditional IP-based applications. The proposed methodology can stop viruses from broadcasting on cell phones, in contrast to the signature-based solutions currently available for mobile devices. Furthermore, the methodology investigates the survivability of mobile phone content against viruses and Bluetooth attacks. When a potential virus is detected, the telecom provider sends an antivirus to the target to stop that type of virus.

      Since web-based CALL was implemented in China, researchers have found it successfully solved the chronic problems that college English teaching has had in China. It is believed that web-based CALL can solve the problems in Chinese college English teaching, to a certain degree, by providing students with a computer-assisted learning environment which allows students a higher achievement with fewer classroom teaching hours. The major advantage for using web-based CALL is that the computerized learning environment enables students to enhance their autonomy by providing them with more learning opportunities and ensuring student-centered instruction.

      With the development of the economy, the human living environment is being badly damaged. Calls for protecting the environment are growing louder, and environmental auditing is becoming increasingly important. This article starts with the fundamental theory of environmental audit, explores its emphases and methods, and concludes with recommendations: formulate and perfect the legal system of environmental audit, carry out promotion and education programs, and require environmental auditors to build up their competence.

      This paper briefly illustrates the Bible's position in Western literature. The Bible, as the essence of Western civilization, is the gradual fusion of the East and the West, that is, of ancient Hebrew culture and ancient Greek culture. At the very beginning the Bible was persecuted by the Roman Empire, and it was later used by rulers; its influence is reflected in all aspects of Western society. The paper briefly discusses the Bible's impact on 17th-century British literature. It analyzes the biblical images in the works of two outstanding 17th-century writers, Milton and Bunyan, as well as their own lives, to show one side of the Bible's impact on British literature.

      The Internet, as the pioneer network, is sweeping the globe, providing a quick and inexpensive means of transmission for digital information products. Vivid image information is widely used by all of us, but it also brings safety problems. To protect data owners' interests, as well as for personal privacy, copyright, and other security reasons, reliable encryption technology must be applied to some image data. Based on the ergodic, mixing, and divergent characteristics of chaotic systems, this paper illustrates how a key sequence generated by a chaotic system encrypts an image and scrambles the encrypted result. The method has very high safety and practicality, and the simulation results show that it has good cryptographic properties.
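A common form of such a scheme (not necessarily the one in the paper) uses the logistic map to generate a keystream that is XORed with the pixel bytes; real systems add position scrambling and stronger key handling. A toy sketch, with the initial condition x0 playing the role of the secret key:

```python
def logistic_keystream(n, x0=0.7, r=3.99):
    """n keystream bytes from the chaotic logistic map x -> r*x*(1-x)."""
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(int(x * 256) % 256)   # quantize the chaotic orbit to a byte
    return bytes(out)

def crypt(data, x0=0.7):
    """XOR data with the chaotic keystream; the same call decrypts."""
    ks = logistic_keystream(len(data), x0)
    return bytes(a ^ b for a, b in zip(data, ks))

pixels = bytes(range(16))      # a tiny stand-in for image data
cipher = crypt(pixels)
assert crypt(cipher) == pixels # XOR is its own inverse
print(cipher.hex())
```

Sensitivity to initial conditions is what makes the chaotic orbit useful as a keystream: a slightly different x0 soon produces completely different bytes.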

      There are several methods for optimization problems, and they differ in how they reach the optimum. Among them, those based on the function's gradient have a great advantage, as they find the fastest way to reach the objective. Here we show three methods based on this principle: two of them are part of our course, and a third one, which we propose, turns out to be very effective. Through this research we implemented a series of algorithms that recreate the steps of mathematical structures designed to solve the many challenging optimization problems found in an engineering career. Based on the theory seen in this course and several extra sources, we had tools strong enough to understand and rebuild such logic. The processes and results are presented here. Not only do we solve a proposed example, but we also show how three different methods based on the same primitive concept can differ in quality, accuracy, and speed.
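The simplest member of the gradient-based family the abstract refers to is steepest descent; a minimal sketch on a quadratic bowl follows. The fixed learning rate and the test function are illustrative, not taken from the paper:

```python
def grad_descent(grad, x0, lr=0.1, tol=1e-10, max_iter=10000):
    """Steepest descent: repeatedly step against the gradient until it vanishes."""
    x = list(x0)
    for _ in range(max_iter):
        g = grad(x)
        if sum(gi * gi for gi in g) < tol:   # stop when the gradient is ~0
            break
        x = [xi - lr * gi for xi, gi in zip(x, g)]
    return x

# f(x, y) = (x - 3)^2 + 2*(y + 1)^2, with gradient below; minimum at (3, -1).
grad = lambda v: [2 * (v[0] - 3), 4 * (v[1] + 1)]
x = grad_descent(grad, [0.0, 0.0])
print(x)   # approaches (3, -1)
```

More sophisticated relatives (conjugate gradients, quasi-Newton) differ mainly in how they choose the step direction and length from the same gradient information.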

      The function of the Internet of Things is to collect item information using sensing systems and, through transmission and intelligent processing systems, to complete the exchange of goods and information between objects, ultimately interconnecting and managing goods with goods and people with goods. Thanks to these unique features, the modern Internet of Things has become an important technique for logistics and marketing management and is gradually entering people's lives. The Internet of Things is an information network that connects objects, with the full capacity to perceive objects and the capabilities of reliable transmission and intelligent processing of information. Its research and development allow us to obtain real-time information about anything. Against this research background, this paper focuses on real-time information theory and the real-time information acquisition problem in the Internet of Things.

      The western ethnic minority areas currently face a lack of educational resources. This difficult position makes the planning and design of professional careers, and career guidance, a key concern to which society pays wide attention. Focusing on the characteristics of western educational resources and economic development, relevant vocational students are engaged in designing their professional careers. This practice encourages students to carry out professional training in a completely real environment. Schools can actively encourage students to seek employment or start a business, and can promote their enthusiasm through career guidance forums in all respects. In this way, students can lay a solid foundation for their future life and development.
  • Session 7: Computer and Computational Intelligence


      Many applications that require decision making depend on data obtained from various sources, each of which provides a different perspective on the problem domain. With the data that is already available and new perspectives that evolve at later stages, there is a need to learn incrementally. In this paper we recognize the similarity between a new perspective and the existing knowledge based on their patterns. The proposed work, referred to as Multi-Perspective based Incremental Learning (MPIL), effectively modifies the knowledge with the available perspectives, understands the patterns of the data to determine the update, and at the same time maintains them for reuse. In experiments applied to the education sector, the proposed approach exhibits better decision-making capacity in a multi-perspective environment.

      Artificial neural networks are effective for complex pattern recognition and hence are popular for classification. This paper develops a Hopfield network for classifying non-correlated English alphabet characters (e.g., 'A', 'L', and 'S'). The objective is the recognition of characters with noise and lateral translations in minimum time. The usability of the Hamming and Euclidean distances is examined for accomplishing these tasks. The paper concludes that both are equally effective for classifying the characters, but the latter is more time-consuming.
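The distance comparison at the heart of the study can be illustrated directly: classify a noisy binary pattern by its nearest stored prototype under either metric. The 5x5 letter bitmaps below are toy stand-ins, not the paper's character set; note that on 0/1 vectors the two metrics agree in value, while the Euclidean form costs extra arithmetic, matching the paper's timing observation:

```python
def hamming(a, b):
    # Number of positions where the two patterns differ.
    return sum(x != y for x, y in zip(a, b))

def euclidean_sq(a, b):
    # Squared Euclidean distance (square root omitted; it preserves ordering).
    return sum((x - y) ** 2 for x, y in zip(a, b))

def classify(pattern, prototypes, dist):
    """Return the label of the nearest stored prototype under metric dist."""
    return min(prototypes, key=lambda label: dist(pattern, prototypes[label]))

# Toy 5x5 bitmaps (flattened row by row) standing in for stored characters.
prototypes = {
    "L": [1,0,0,0,0, 1,0,0,0,0, 1,0,0,0,0, 1,0,0,0,0, 1,1,1,1,1],
    "T": [1,1,1,1,1, 0,0,1,0,0, 0,0,1,0,0, 0,0,1,0,0, 0,0,1,0,0],
}
noisy_L = prototypes["L"][:]
noisy_L[7] = 1           # flip two pixels to simulate noise
noisy_L[12] = 1
print(classify(noisy_L, prototypes, hamming),
      classify(noisy_L, prototypes, euclidean_sq))
```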

      In this study, a two-stage process is proposed for portfolio selection. In the first stage, a clustering method is used to identify good-quality assets in terms of α-reliability asset classification. In the second stage, the investment allocation among the selected good-quality assets is optimized using a genetic algorithm, based on a stochastic portfolio-selection model with minimum transaction lots and the α-reliability decision. Through this two-stage α-reliability model process, an optimal portfolio can be determined. The experimental results show that its application to portfolio selection is reliable and useful.

      A novel automated fuzzy optimal thresholding technique to differentiate malignant melanoma lesions from normal skin is proposed in this paper. Segmentation of pigmented skin lesion images for malignant melanoma diagnosis is achieved with the help of a fuzzy optimal threshold value and mathematical morphology. The input melanoma skin lesion color images are acquired and preprocessed to remove noise as well as unwanted structures such as hairs. The preprocessed image is smoothed by morphological operations, converted into a grayscale image, and binarized using the fuzzy optimal threshold value obtained from fuzzy Gaussian measures. The largest labeled region, the cancer area, is identified in the binary image, and its borders are extracted and imposed on the original skin lesion image. The efficiency of the algorithm was tested on 100 images and achieved up to 90% accuracy in extracting the skin lesion borders.

      In recent years, many studies have been conducted in the field of Mobile Ad Hoc Networks with the aim of building a virtual infrastructure of nodes. The common goal of all of them is to select a node, called the cluster head, which manages the relationships between nodes. In this paper, we present a new clustering algorithm for Mobile Ad Hoc Networks based on node weights. For calculating a node's weight we introduce four new parameters: congestion, stability, the number of nodes moving towards the node, and remaining battery. The goals of this algorithm are to decrease the number of clusters formed, maintain a stable clustering structure, and maximize the lifespan of the mobile nodes in the system. In simulation, the proposed algorithm was compared with the WCA, MOBIC, and Lowest_ID algorithms, and the results reveal that it achieves these goals.

      This paper extends Immediate Constituent Analysis (IC Analysis) in linguistics based on the properties of the binary tree as a data structure. Compared with the original, three breakthroughs are made in the extended analysis through the study of the configuration of the binary tree, the employment of mathematical inference, and the correlation between nodes and levels in the tree and their counterparts, linguistic units and stratifications, in natural language (English). They include a classification of the IC binary tree, the revelation of the mathematical relation between linguistic units or constituents (represented by nodes) and linguistic stratifications (represented by layers in the tree), and implications for computer-aided foreign language teaching and the computer processing of natural language (English).

      Electrospray is sometimes, improperly, called electrohydrodynamic atomization. Electrohydrodynamics (EHD) has been applied in many areas, such as EHD atomization, EHD-enhanced heat transfer, EHD pumps, and electrospray nanotechnology. A high voltage is applied to a liquid supplied through an emitter; ideally, the liquid reaching the emitter tip forms a Taylor cone, which emits a liquid jet through its apex. From the electrostatic field, the electric body forces are determined and included in the Navier-Stokes equations. The approach in this work was to simultaneously solve the coupled EHD and electrostatic equations; the simulation is carried out using the ANSYS FLUENT system. The velocity and pressure fields, as well as the electric characteristics of the EHD flows, are calculated. The model includes neither current nor a droplet break-up model. The predicted velocity fields for cases 1 and 2 were found to be consistent with published results.

      We propose a new approach for dynamic software updates. For dynamically updating a real-time application, a new method is proposed that uses the Rate Monotonic Scheduling algorithm. This approach allows updating applications that until now could not be updated at runtime at all, or could be updated only with a possibly indefinite delay between the time an update is initiated and the time it takes effect. Unlike existing approaches, we allow arbitrary changes to functions active on the stack, without requiring the programmer to anticipate the future evolution of a program. We argue, using actual examples, that this capability is needed to dynamically update common real applications. At the heart of our approach is a stack reconstruction technique that allows all functions on the call stack to be updated at the same time, guaranteeing that all active functions have the same version after an update. This is the first general approach that maintains both code and data representation consistency for real-time applications.
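The Rate Monotonic scheduler the method builds on has a classic sufficient schedulability condition, the Liu-Layland utilization bound. How it relates to the paper's update machinery is not stated in the abstract; the sketch below just shows the bound check itself, with made-up task parameters:

```python
def rm_schedulable(tasks):
    """Liu-Layland sufficient test for Rate Monotonic scheduling.

    tasks: list of (execution_time, period) pairs. Returns True if total
    utilization is within n*(2^(1/n) - 1); False means 'not guaranteed'
    (an exact response-time analysis could still admit the task set)."""
    n = len(tasks)
    utilization = sum(c / p for c, p in tasks)
    bound = n * (2 ** (1 / n) - 1)
    return utilization <= bound

# Three periodic tasks: (worst-case execution time, period).
tasks = [(1, 4), (1, 5), (2, 10)]
print(rm_schedulable(tasks))   # utilization 0.65 vs bound ~0.7798
```

For large n the bound tends to ln 2 ≈ 0.693, so Rate Monotonic guarantees schedulability for any task set using under about 69% of the CPU.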

      Through a multiple regression analysis of the relations between executive pay and corporate performance, the control of large shareholders, and the board's monitoring in a sample of 100 private listed companies from the Shanghai and Shenzhen stock markets (2008), the present paper finds that there is a positive correlation between corporate performance and executive pay. The board's monitoring strengthens this positive correlation, while the control of large shareholders may distort it. Executive pay is based on corporate performance; once the ultimate proprietor starts to extract private interests, executive pay turns into a major means of manipulating executives. It is therefore beneficial to supervise the control of large shareholders and to enhance the board's independence in order to improve the governance of listed companies.
  • Session 8: Computer and Computational Intelligence


      This paper mainly discusses a new teaching model for Business English translation, the START Model. This is a concrete, operational translation teaching method that not only develops the teacher's guidance and organization, but also leads students to think about the reasons behind various translations, to actively discover and summarize translation methods, and to appreciate and grasp the true essence of translation.

      This study randomly selected 183 active athletes from Zhejiang province and surveyed them with the ABQ as revised in China by Cheng Zuosong. Based on analysis of the survey results with SPSS 16.0, the following conclusions are drawn: (1) Athlete burnout in Zhejiang is not high, belonging to the middle level; there are significant differences in burnout level across age, training years, family economic situation, and whether the athlete is a key player on the team. (2) Multiple regression analysis showed that defeat and sports injury are effective predictors of burnout among Zhejiang athletes. Across the burnout dimensions, defeat and interpersonal relationships are effective predictors of emotional/physical exhaustion; sports injury is an effective predictor of a reduced sense of accomplishment; and defeat, sports injury, and interpersonal relationships are effective predictors of sport devaluation.

      The traditional client/server mode of back-end communication software is usually designed as a gateway/core framework. However, this design is imperfect: experience shows that it hinders expansion and yields low code reusability. This paper draws on the working mechanism of workflow to produce a new design for a back-end communication system framework that is customizable, conducive to expansion, and adaptable to various businesses. Upon examination, the new design meets the performance and functional requirements of an information broadcast system and provides important technical support for the system platform. Moreover, the implementation process is versatile and offers a better solution for related work.

      Assessment is a common device for checking and testing the progress of foreign language teaching and learning in order to obtain first-hand data for future work. This paper elaborates several points concerning evaluating and developing assessment tasks in a TESOL program in New Zealand.

      Image segmentation is a fundamental problem in image processing. The traditional watershed segmentation algorithm is simple and intuitive, among other advantages; however, it suffers from over-segmentation, high sensitivity to noise, and difficult image pre-processing. We therefore propose an improved algorithm based on morphological reconstruction, which suppresses noise, reduces the difficulty of pre-processing, and improves the segmentation results. Within the traditional watershed algorithm we apply a faster sorting algorithm and a reasonable treatment of uncertainties during flooding, making the segmentation quick and effective. Experiments show that the algorithm effectively eliminates over-segmentation and markedly improves the speed and accuracy of the watershed algorithm.
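The paper's own reconstruction procedure is not reproduced in the abstract; as a rough illustration of the idea, the sketch below implements grayscale morphological reconstruction by iterated dilation on a hypothetical 1-D signal, the operation typically used to flatten shallow extrema before watershed flooding.

```python
import numpy as np

def reconstruct_by_dilation(marker, mask):
    """Grayscale morphological reconstruction: repeatedly dilate the
    marker (3-element max filter) while clipping it under the mask,
    until the result stabilizes."""
    marker = np.asarray(marker, dtype=float)
    mask = np.asarray(mask, dtype=float)
    current = np.minimum(marker, mask)
    while True:
        padded = np.pad(current, 1, mode="edge")
        dilated = np.max(np.stack([padded[:-2], padded[1:-1], padded[2:]]), axis=0)
        nxt = np.minimum(dilated, mask)
        if np.array_equal(nxt, current):
            return nxt
        current = nxt

def suppress_shallow_peaks(signal, h):
    """h-maxima transform: peaks rising less than h above their
    surroundings are flattened, which reduces over-segmentation."""
    signal = np.asarray(signal, dtype=float)
    return reconstruct_by_dilation(signal - h, signal)

# Hypothetical 1-D profile: one tall peak and one small noise peak.
profile = np.array([0, 1, 8, 1, 0, 1, 2, 1, 0], dtype=float)
smoothed = suppress_shallow_peaks(profile, h=3.0)
```

After the transform the noise peak is gone and the tall peak survives (lowered by h), so a subsequent watershed would produce one catchment basin instead of two.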

      This study explores the centrality of the curriculum structure of a graduate institute of adult education using social network analysis. The subjects were 44 curricula from the first graduate department of adult education established in Taiwan, the Department of Adult & Continuing Education (NTNU). UCINET 6.198 was used to analyze the relationships among keywords. The results show that "education", "culture", "social", and "adult" form the foundation of the department's curriculum structure. The influence of the different keywords can be used to guide curriculum planning for adult learners.
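The 44 curricula themselves are not reproduced here; with made-up keyword lists standing in for them, the sketch below shows the kind of degree-centrality computation a tool like UCINET performs on a keyword co-occurrence network.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical course keyword sets standing in for the 44 curricula.
courses = [
    {"education", "adult"},
    {"education", "social"},
    {"education", "culture", "adult"},
    {"social", "culture"},
]

# Build an undirected keyword co-occurrence network.
neighbors = defaultdict(set)
for keywords in courses:
    for a, b in combinations(sorted(keywords), 2):
        neighbors[a].add(b)
        neighbors[b].add(a)

# Normalized degree centrality: ties / (n - 1).
n = len(neighbors)
centrality = {k: len(v) / (n - 1) for k, v in neighbors.items()}
```

Keywords with centrality near 1 are connected to nearly every other keyword and play the "foundation" role the study describes.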

      The final objective of remote sensing is to extract desired information from remotely sensed imagery and signals. With the major advances in remote sensing devices in recent years, the need for advanced information-processing techniques is evident: progress in information processing must at least keep pace with the growth of data and with the demand for more accurate and detailed information from that data. Classification is one of the major tasks in image processing. We classify objects in order to make sense of our environment by reducing a multiplicity of phenomena to a relatively small number of general classes. On a walk, for example, you might point to trees, tractors, or swans; what you are actually doing is identifying an observed object and allocating it to a pre-existing class, i.e., 'giving it a name'.

      This paper presents a compilation of application examples of unconstrained multivariate optimization problems, using two principal methods. The first is the Newton-Raphson (NR) method, commonly used to find roots of a polynomial in a single variable, extended with certain changes to handle multiple variables. We also use the Fletcher-Powell (FP) method for multiple variables. Both methods are commonly used in optimization; here they are adapted to cases involving multiple variables rather than one, always seeking the optimum in problems drawn from several engineering areas. The resulting algorithm improves on Newton's method and can be classified as quasi-Newtonian.
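The paper's own examples are not given in the abstract; as a generic sketch of the multivariable Newton-Raphson idea it describes, the following code finds the stationary point of a hypothetical two-variable objective by applying Newton's method to the gradient, with finite differences standing in for analytic derivatives.

```python
import numpy as np

def grad(f, x, h=1e-6):
    """Central-difference gradient of f at x."""
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

def newton_raphson(f, x0, tol=1e-8, max_iter=50):
    """Multivariate Newton-Raphson on the gradient: solve H dx = -g."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(f, x)
        # Finite-difference Hessian, built column by column.
        h = 1e-4
        H = np.zeros((len(x), len(x)))
        for i in range(len(x)):
            e = np.zeros_like(x)
            e[i] = h
            H[:, i] = (grad(f, x + e) - grad(f, x - e)) / (2 * h)
        step = np.linalg.solve(H, -g)
        x = x + step
        if np.linalg.norm(step) < tol:
            break
    return x

# Hypothetical quadratic objective with its minimum at (3, -2).
f = lambda x: (x[0] - 3) ** 2 + 2 * (x[1] + 2) ** 2
x_star = newton_raphson(f, [0.0, 0.0])
```

On a quadratic the method lands on the optimum in essentially one step; on general objectives it iterates until the step shrinks below the tolerance.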

      The main aim of this paper is to present a new method of detecting outliers in skewed rainfall observations based on an adjusted boxplot. The proposed adjusted boxplot adopts the medcouple to modify the classic method, which often erroneously declares many points to be outliers when data are skewed. The medcouple makes the adjusted boxplot both capture skewness and remain insensitive to outliers. Synthetic data are used to test the performance of the proposed method. The results illustrate that the proposed method can serve as a fast and automatic outlier detection tool for skewed rainfall observations.
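The paper's exact fences are not stated in the abstract; the sketch below uses the standard adjusted-boxplot rule attributed to Hubert and Vandervieren (exponential fences driven by the medcouple), with a naive O(n²) medcouple and made-up data.

```python
import numpy as np

def medcouple(x):
    """Naive O(n^2) medcouple: median of the kernel h over pairs with
    x_i below and x_j above the median (ties with the median skipped)."""
    x = np.sort(np.asarray(x, dtype=float))
    med = np.median(x)
    lower = x[x < med]
    upper = x[x > med]
    vals = [((xj - med) - (med - xi)) / (xj - xi)
            for xi in lower for xj in upper]
    return float(np.median(vals))

def adjusted_fences(x):
    """Skewness-adjusted boxplot fences: exponential factors stretch
    the fence on the long-tail side and shrink it on the other."""
    x = np.asarray(x, dtype=float)
    q1, q3 = np.percentile(x, [25, 75])
    iqr = q3 - q1
    mc = medcouple(x)
    if mc >= 0:
        return (q1 - 1.5 * np.exp(-4 * mc) * iqr,
                q3 + 1.5 * np.exp(3 * mc) * iqr)
    return (q1 - 1.5 * np.exp(-3 * mc) * iqr,
            q3 + 1.5 * np.exp(4 * mc) * iqr)
```

On symmetric data the medcouple is zero and the fences reduce to the classic 1.5 x IQR rule; on right-skewed data (medcouple > 0) the upper fence widens so that long-tail observations are no longer flagged wholesale.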

      In the absence of a standard productivity measurement system, construction labor productivity has been declining for over a decade. In addition, the influence of various qualitative factors on labor productivity has not been incorporated accurately when scheduling and estimating project durations. The objective of this study is therefore to estimate labor production rates using an Artificial Neural Network (ANN). Qualitative factors influencing the rates, such as weather, project location, and site conditions, were identified on project sites during the measurement of production rates for concreting activities. Data obtained from seven building project sites were used in the ANN to estimate labor production rates. The results with the least error can be used as reliable and valid production rates for the Malaysian construction industry.
  • Session 9: Computer and Computational Intelligence


      This paper deals with various applications that draw on the strengths of BBO itself, as well as its hybridization with other optimization techniques, to achieve fruitful results. BBO has also been combined with case-based reasoning for evaluation purposes.

      The main purpose of cross-site scripting (XSS) is to steal a user's sensitive information by sending it to a third party without the user's authorization. To protect the security of web users' information effectively, this paper discusses the theory and workflow of XSS attacks that use dynamic code obfuscation (DCO) technology, and proposes a behavior-based XSS detection technique. Our experimental results demonstrate that the proposed behavior-based XSS detection technique is feasible in practice.

      With the networking and digitization of information technology, the construction of microteaching environments has entered the digital stage and new modes of microteaching have appeared, yet the evaluation of microteaching lags behind. The interactive evaluation of the Moodle platform offers diversified instructional assessment activities and can therefore be applied to web-based microteaching evaluation. By using Moodle's interactive evaluation, appropriate evaluation activities and a reasonable blend of teaching can solve some problems of traditional microteaching, such as shortages of time and equipment, and thereby drive the reform of microteaching.

      Following the model of the database design process and its steps, and taking a restaurant chain system as an example, this paper analyzes the system's business. Guided by the business requirements, it discusses in detail the design process of the system's Oracle-based conceptual data model, along with the basic techniques and methods of database design under this model. The design pattern has been used in real projects and proved to be applicable and reliable.

      As vocational education develops rapidly in modern times, increasingly high requirements are placed on the teaching of vocational teachers, and at the same time people's expectations of classroom teaching are constantly changing. Under such circumstances, how should the classroom teaching quality of teachers in vocational colleges be evaluated? In this paper, based on the theory of fuzzy comprehensive evaluation, the authors make a fair and objective evaluation of the classroom teaching of teachers in vocational colleges, with the ultimate purpose of improving the quality of vocational education.
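The paper's actual indicator system is not reproduced in the abstract; the sketch below, with hypothetical weights and membership grades, shows the basic fuzzy comprehensive evaluation step: a weight vector composed with a membership matrix, followed by the maximum-membership rule.

```python
import numpy as np

# Hypothetical indicators: teaching content, method, attitude, effect.
weights = np.array([0.3, 0.3, 0.2, 0.2])  # must sum to 1

# Membership of each indicator in the grades
# [excellent, good, fair, poor], e.g. from rater proportions.
R = np.array([
    [0.5, 0.3, 0.2, 0.0],
    [0.4, 0.4, 0.1, 0.1],
    [0.6, 0.2, 0.2, 0.0],
    [0.3, 0.5, 0.2, 0.0],
])

# Weighted-average composition: B = w @ R.
B = weights @ R
grades = ["excellent", "good", "fair", "poor"]
verdict = grades[int(np.argmax(B))]  # maximum-membership principle
```

With normalized weights and rows, B is itself a membership vector over the grades, and the grade with the largest membership gives the overall verdict.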

      Load distribution is the core of rolling schedule pre-calculation. A reasonable load distribution brings the mill's production capacity into full play, stabilizes the production process, and improves product quality. Distribution based on a target proportion is one of the most important distribution methods, and its mathematical model is essentially the solution of a set of nonlinear equations. In this paper, the authors compare representative algorithms from home and abroad and put forward a new algorithm, an improved Newton's method. This algorithm retains the good convergence of Newton's method while avoiding the computation of the Jacobian inverse matrix. Simulation results show that the calculation is fast and the accuracy relatively high, so the algorithm meets online requirements.
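The abstract does not give the equation set; on a hypothetical 2x2 nonlinear system standing in for the load-distribution model, the sketch below shows the key point it describes: advancing Newton's method by solving the linear system J dx = -F directly instead of forming the Jacobian inverse.

```python
import numpy as np

def F(x):
    """Hypothetical nonlinear equation set: F(x) = 0."""
    return np.array([x[0] ** 2 + x[1] ** 2 - 4.0,
                     x[0] - x[1]])

def jacobian(x):
    return np.array([[2 * x[0], 2 * x[1]],
                     [1.0, -1.0]])

def newton_solve(x0, tol=1e-10, max_iter=50):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        # Solve J dx = -F rather than computing inv(J) @ F.
        dx = np.linalg.solve(jacobian(x), -F(x))
        x = x + dx
        if np.linalg.norm(dx) < tol:
            break
    return x

root = newton_solve([1.0, 2.0])
```

Solving the linear system costs one LU factorization per step instead of a full inversion, which is the usual reason for avoiding the explicit Jacobian inverse in online settings.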

      It is found that the teaching materials exhibit the written-language characteristics of definitions and introductions. In this paper, the authors analyze teaching cases in order to uncover the search problems involved in locating the definitions of terms across multiple disciplines, such as the skill of leafing through titles; they also study explanation-searching operations and explore methods of searching for the fundamental titles of unfamiliar nouns in the major courses. Three operations are used in searching for the characteristics of the taught courses: leafing through the pages of the book, reading the paragraphs, and extracting typical content to be explained in detail. Meanwhile, the authors develop skills for reading words, things, explanations, and fine distinctions to resolve the difficulty-transformation problems that stall lesson preparation, so that the ultimate purpose of accurate lesson preparation can be achieved.

      Tiantai County in Zhejiang province is one of the province's 26 less-developed regions; located in a mountainous district, it has many people and little land, with an obvious tension between population and land. Earlier this year, the county government proposed and implemented the development strategy of "small county, big city" to realize great-leap-forward development and keep pace on the road to basic modernization. Based on an analysis of the background and significance of Tiantai County, this paper sets out the "small county, big city" strategy and its five basic connotations, and then proposes ideas and suggestions for the strategy concerning the overall orientation of county development, county spatial layout, industrial concentration, upgrading of the urban grade, and public service facilities.

      This paper identifies problems with fund raising for rural health care, small and medium-sized enterprises, rural social endowment insurance, and so on, and, in a groundbreaking step, normalizes and structures them as a multi-player game problem. It analyzes the Nash equilibrium strategies and equilibrium solutions of this multi-game model and obtains several valuable results. These results are instructive for the design of fund-raising policy; in addition, the paper's game-theoretic analysis of fund raising is itself a contribution to financial game theory. Finally, the paper carries out an empirical test that supports the rationality and scientific soundness of its approach.

      In this paper, we propose that the selective use of carry-save arithmetic can accelerate a variety of arithmetic-dominated circuits. Carry-save arithmetic occurs naturally in many DSP applications; field-programmable gate arrays (FPGAs), however, are not particularly well suited to it. To address this, we introduce the Field Programmable Counter Array (FPCA), an accelerator for carry-save arithmetic intended for integration into an FPGA, built from Generalized Parallel Counters (GPCs). The proposed GPC reduces the complexity of carry-save arithmetic compared to basic adders.
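For readers unfamiliar with carry-save form, the bit-level sketch below (plain Python, not the paper's FPCA hardware) shows how a 3:2 compressor reduces three addends to two without propagating carries across bit positions.

```python
def carry_save_add(a, b, c):
    """3:2 compressor on non-negative integers: per-bit sum = a^b^c,
    per-bit carry = majority(a, b, c) shifted left one position.
    No carry ripples between bit positions."""
    ps = a ^ b ^ c                            # partial-sum bits
    sc = ((a & b) | (a & c) | (b & c)) << 1   # shifted-carry bits
    return ps, sc

def add_many(values):
    """Reduce a list of addends with carry-save stages; one final
    carry-propagate addition produces the conventional result."""
    vals = list(values)
    while len(vals) > 2:
        a, b, c = vals.pop(), vals.pop(), vals.pop()
        vals.extend(carry_save_add(a, b, c))
    return sum(vals)
```

Because each compressor stage has constant depth, long addend lists can be reduced in logarithmic depth, with the single slow carry-propagate add deferred to the very end.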
  • Session 10: Electronic Design Automation


      This study proposes a low-cost data acquisition system based on the PIC32MX460F512L microcontroller for remote data acquisition (RDAQ) applications, such as wind energy, where moderate accuracy suffices. Data acquisition is performed by the microcontroller's built-in ten-bit ADC, and timekeeping is provided by its built-in Real-Time Clock (RTC). An ENC28J60 Ethernet controller is used to interface with the Internet. The system also backs data up to a USB flash drive to increase reliability.

      Forward Error Correction (FEC) is a well-developed system of error control for data transmission. It is typically applied in systems where building a reverse channel, through which the receiver could report and request correction of data errors, would be difficult or costly. The following pages cover the basic function and background theory of FEC together with some example FEC codes. We also present research findings on its application to Internet and wireless data transmission, and in the closing paragraphs we describe recent developments in FEC.
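The abstract names no specific code; as one concrete FEC example, the sketch below implements the classic Hamming(7,4) code, which corrects any single bit error without a reverse channel.

```python
def hamming74_encode(d):
    """Encode 4 data bits [d1,d2,d3,d4] into 7 bits with even-parity
    bits at codeword positions 1, 2 and 4."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4   # covers positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4   # covers positions 2,3,6,7
    p3 = d2 ^ d3 ^ d4   # covers positions 4,5,6,7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Recompute the parities; their pattern is the 1-based position
    of a single flipped bit (0 means no error). Returns the data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3
    if syndrome:
        c[syndrome - 1] ^= 1  # correct the flipped bit in place
    return [c[2], c[4], c[5], c[6]]
```

The receiver corrects the error entirely from the redundancy carried forward with the data, which is exactly the property that makes FEC attractive when no feedback channel exists.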

      A virtual instrument (VI) to diagnose deformities in the lungs is presented. The VI provides information about lung deformities by analyzing the volume and sound information of a patient's lungs. Pulmonary Function Testing (PFT) has been a major step forward in assessing the functional status of the lungs, and it can be used to diagnose the presence of obstructive or restrictive lung diseases.

      In this paper, we propose a novel design for a microcontroller-based, user-interactive futuristic display built on the propeller clock display. It was designed to allow the user to alter the display with finger gestures; we demonstrate changing the time and the type of display dynamically at runtime. The design exploits the fact that an image persists in human vision for about 1/16th of a second, and can therefore be employed as a cost-effective way to create fascinating displays with microcontrollers. The work thus successfully investigates the possibility of creating such displays wherever there is relative motion (linear or circular). Such displays could also be installed in the wheels of automobiles, in underground subways (utilizing the linear motion of trains), on ceiling fans, etc., using the same basic principles discussed in this paper.
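As a back-of-the-envelope check of the persistence-of-vision principle the design relies on (the figures here are illustrative, not the paper's), the whole image must be redrawn within the roughly 1/16 s persistence interval, which fixes the minimum rotation speed and the per-column LED timing budget.

```python
PERSISTENCE_S = 1 / 16          # approx. persistence of vision
COLUMNS = 120                   # hypothetical angular resolution

# One full revolution may take at most PERSISTENCE_S seconds,
# otherwise the image flickers.
min_rpm = 60 / PERSISTENCE_S    # minimum revolutions per minute

# Time budget for lighting one angular column per revolution.
column_time_s = PERSISTENCE_S / COLUMNS
```

With these assumed numbers the rotor must spin at 960 RPM or faster and the microcontroller has about half a millisecond to update each column, comfortably within reach of common timer interrupts.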

      This paper presents a novel multi-function bitstream generator tool developed to support the VS1000, a radiation-hardened SOI FPGA fabricated in a 0.5 um SOI-CMOS process. The tool is part of the VS1000 automated design software system (VDK). Like other bitstream generators, it can generate a basic bitstream containing the whole chip's configurable contents for programming the FPGA; it can also supply partial bitstreams and readback command bitstreams, as well as change the chip's configurable properties. The tool serves FPGA full-chip verification, package test, normal test, radiation test, and application use. The VS1000 FPGA tests above illustrate that this bitstream generator tool works well.

      Unplanned and irregular urban growth threatens sustainable housing and sustainable urban management. It is therefore very important to safeguard the frontage and borders of contemporary cities, although this is a time-consuming, expensive, and critical task, especially in metropolitan areas and great cities like Tehran. This paper seeks a model for using smart technologies to assist urban management. The proposed plan is based on control systems and applications supported by sensor networking, to help the Tehran municipality safeguard the city frontage and prevent unplanned and irregular urban growth.
  • Session 11: Intelligent Network and Computing


      In recent years, smartphones and tablet PCs have become ubiquitous in our daily lives. There is no denying that finger-based touch input has become the major interaction modality for smartphone users, and finger painting is one of the most commonly used operations. On the Android platform, finger painting is mainly implemented through the MotionEvent class, which has two limitations. (1) When the user's gesture is slow, too many coordinate points are produced and received, leading to heavy computation and long waits for the user. (2) The coordinate points received from the gesture may be discontinuous, which complicates gesture storage and later application development. To improve the user experience, an efficient component-based finger paint scheme for mobile devices is proposed. First, three MotionEvent actions are grouped into a component, and a reduction procedure is applied to that component; in this way the storage load is lightened and the computation time of later application processing is reduced. Then an existing efficient line-segment intersection scheme is modified to overcome the limits of the MotionEvent class on Android and to achieve an efficient finger paint technique. In the experiments, the proposed scheme is integrated into an e-reader application to implement component-based annotation, demonstrating that the proposed scheme overcomes the limits of MotionEvent on Android and provides a better user experience.
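The paper's modified scheme is not reproduced here; the sketch below shows the standard orientation-based line-segment intersection test that such a scheme builds on, in plain Python rather than Android code.

```python
def orientation(p, q, r):
    """Sign of the cross product (q-p) x (r-p):
    1 counter-clockwise, -1 clockwise, 0 collinear."""
    val = (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])
    return (val > 0) - (val < 0)

def on_segment(p, q, r):
    """Assuming collinearity, is r within the bounding box of pq?"""
    return (min(p[0], q[0]) <= r[0] <= max(p[0], q[0]) and
            min(p[1], q[1]) <= r[1] <= max(p[1], q[1]))

def segments_intersect(p1, q1, p2, q2):
    o1, o2 = orientation(p1, q1, p2), orientation(p1, q1, q2)
    o3, o4 = orientation(p2, q2, p1), orientation(p2, q2, q1)
    if o1 != o2 and o3 != o4:          # proper crossing
        return True
    # Collinear special cases: an endpoint lies on the other segment.
    return ((o1 == 0 and on_segment(p1, q1, p2)) or
            (o2 == 0 and on_segment(p1, q1, q2)) or
            (o3 == 0 and on_segment(p2, q2, p1)) or
            (o4 == 0 and on_segment(p2, q2, q1)))
```

Each test costs only a handful of multiplications, which is why segment intersection is a practical primitive for stroke processing on mobile devices.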

      In wireless sensor networks, maximizing network lifetime is a critical issue. In the newly emerging wireless multimedia sensor networks, the high volume of sensed audio and video data must be compressed before transmission, and there is a tradeoff between the power consumed by data compression and by data transmission. How to balance compression against transmission so as to maximize network lifetime is therefore a challenging research issue. In this research, we propose mathematical models describing the power consumption of data compression and transmission for sensor nodes in hexagon-shaped clusters. Under the proposed model, we derive the optimal data compression rate that minimizes the overall power consumption of the whole cluster. The subdivision yields considerable savings in the cluster's overall power consumption, and the savings depend heavily on the nodes' transmission range and deployment density.

      In this paper, we propose a new technique for calculating the contention window (CW) size in backoff mode. We use a maximum-function formulation to improve the performance of contention window algorithms for the IEEE 802.11e quality-of-service (QoS) wireless local area network (WLAN). The proposed algorithm is named the Active Node Back-off Algorithm (ANBA). Its performance is compared with legacy back-off algorithms, namely Binary Exponential Back-off (BEB) and Estimation-Based Back-off (EBB). The throughput and fairness index of ANBA are analyzed under the arbitration inter-frame space (AIFS) priority technique in the enhanced distributed channel access function (EDCAF). In this research, the channel access scheme is based on carrier sense multiple access with collision avoidance using request-to-send and clear-to-send (CSMA/CA RTS/CTS). Our numerical results show that ANBA outperforms the older back-off algorithms in terms of throughput and fairness index.
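ANBA itself is not specified in the abstract; for contrast, the sketch below models the legacy Binary Exponential Back-off baseline it is compared against: the CW doubles on each collision, is capped at CWmax, and resets on success (the parameter values here are illustrative 802.11-style bounds, not taken from the paper).

```python
import random

CW_MIN, CW_MAX = 15, 1023  # illustrative 802.11-style bounds

def next_cw(cw, collided):
    """Binary Exponential Back-off: double the CW (kept as 2^k - 1
    values) after a collision, reset to CW_MIN after a success."""
    if collided:
        return min(2 * cw + 1, CW_MAX)
    return CW_MIN

def backoff_slots(cw, rng=random):
    """Backoff counter drawn uniformly from [0, cw]."""
    return rng.randint(0, cw)

# CW trajectory over a run of consecutive collisions.
trajectory = [CW_MIN]
for _ in range(8):
    trajectory.append(next_cw(trajectory[-1], collided=True))
```

The trajectory saturates at CWmax after a few collisions, which illustrates why BEB throughput degrades under heavy contention and motivates adaptive schemes such as the one proposed.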

      In this paper, we propose an algorithm for locating a target's position in a mobile wireless channel by estimating the direction of arrival. The proposed algorithm applies the spatial averaging method to the MUSIC algorithm: the diagonal matrix of the spatial averaging method is inverted to obtain a new signal correlation matrix. The existing algorithm was analyzed and compared by applying the proposed signal correlation matrix to direction-of-arrival estimation in MUSIC. In the experiments, the proposed algorithm achieved a resolution better than 5° compared with the linear prediction and Min-norm algorithms, and an improvement of more than 2° over the standard MUSIC algorithm.
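The proposed correlation-matrix modification is not detailed in the abstract; the sketch below implements plain MUSIC direction-of-arrival estimation on a simulated uniform linear array (all scene parameters are made up), the baseline for the kind of experiment described.

```python
import numpy as np

def music_spectrum(R, n_sources, n_sensors, angles_deg, d=0.5):
    """MUSIC pseudospectrum for a uniform linear array with element
    spacing d (in wavelengths): project steering vectors onto the
    noise subspace and invert the residual power."""
    eigvals, eigvecs = np.linalg.eigh(R)          # ascending order
    En = eigvecs[:, : n_sensors - n_sources]      # noise subspace
    spectrum = []
    for theta in np.deg2rad(angles_deg):
        a = np.exp(-2j * np.pi * d * np.arange(n_sensors) * np.sin(theta))
        denom = np.linalg.norm(En.conj().T @ a) ** 2
        spectrum.append(1.0 / max(denom, 1e-12))  # guard exact nulls
    return np.array(spectrum)

# Simulated scene: 8 sensors, sources at -20 and 30 degrees.
M, true_doas = 8, [-20.0, 30.0]
A = np.column_stack([
    np.exp(-2j * np.pi * 0.5 * np.arange(M) * np.sin(np.deg2rad(t)))
    for t in true_doas])
R = A @ A.conj().T + 0.01 * np.eye(M)             # signal + noise power

grid = np.arange(-90.0, 90.0, 0.5)
spec = music_spectrum(R, n_sources=2, n_sensors=M, angles_deg=grid)
peaks = grid[np.argsort(spec)[-2:]]               # two largest peaks
```

With a clean covariance the pseudospectrum peaks fall on the true directions; spatial averaging variants like the paper's modify R before this eigendecomposition to cope with correlated sources.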

      In the last few years, there has been a dramatic increase in the adoption rate of smartphones in the world and many experts predict that, by the end of this year, Google's Android will become the dominant operating system for these devices. It is not surprising, therefore, that Android phones are a popular target for malicious hackers. These hackers often make use of rootkits to achieve their goal and this paper provides a thorough analysis of rootkit attacks on Android-based smartphones.

      The swift progress of computing has enabled the growth of low-cost wireless sensor networks (WSNs). In the past few years, much research effort has gone into instrumenting the physical world with large numbers of networked sensor nodes that cooperate while remaining self-configuring. Wireless sensor networks produce a huge quantity of data that must be processed, delivered, and assessed according to the application objectives, and data storage has become an important issue as large amounts of collected data need to be archived for future information retrieval. This paper reviews sensor data storage systems to identify the best among them so that we can better monitor and control the physical world.

      Sensor networks are dense wired or wireless networks for gathering and publishing environmental data. They consist of a large number of sensor nodes linked to each other. Sensor networks produce huge volumes of data that require advanced analytical processing and machine interpretation, and processing and interpreting such heterogeneous sensor data affects the lifetime of the network. In this paper we evaluate two known methods, the Semantic Sensor Web (SSW) and the Semantic Sensor Observation Service (SemSOS), which store sensor data in semantic form by annotating it with semantic metadata. We compare the two systems' rates of energy consumption, showing that the more data is transferred the shorter the sensor network's lifetime, and that SSW consumes less energy than SemSOS.

      This paper presents a mineral identification method using image enhancement, the perceptron algorithm, and Support Vector Machine theory. First, normalization and image enhancement are used to process the image; the perceptron algorithm is then applied to identify ore automatically; finally, a Support Vector Machine learning mechanism is used to improve the recognition rate. Experimental results show that the recognition rate of this digital image processing method exceeds 90%, so it can be applied effectively in the field of mineral identification.
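The ore features themselves are not given in the abstract; the sketch below trains the classic perceptron algorithm, the middle stage of the described pipeline, on hypothetical 2-D feature vectors (e.g. brightness and texture scores).

```python
def perceptron_train(samples, labels, epochs=20, lr=1.0):
    """Perceptron learning rule on linearly separable data:
    w <- w + lr * y * x whenever a sample is misclassified."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        errors = 0
        for x, y in zip(samples, labels):   # y in {-1, +1}
            score = sum(wi * xi for wi, xi in zip(w, x)) + b
            if y * score <= 0:              # misclassified or on boundary
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
                errors += 1
        if errors == 0:                     # converged: separating plane found
            break
    return w, b

def perceptron_predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1

# Hypothetical ore (+1) vs. background (-1) feature vectors.
X = [(2.0, 3.0), (3.0, 3.5), (2.5, 4.0), (0.5, 0.5), (1.0, 0.2), (0.2, 1.0)]
Y = [1, 1, 1, -1, -1, -1]
w, b = perceptron_train(X, Y)
```

The perceptron finds some separating hyperplane quickly but not the maximum-margin one, which is presumably why the paper follows it with an SVM stage to lift the recognition rate.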

      "Message routing" is one of the important issues in computer networks, and fault-tolerant routing has a major effect on the fast exchange of information in such networks. In this article, four types of ants are used for fault-tolerant routing in mesh networks: two types perform routing, while the other two perform fault finding, discover new routes, and dynamically find the shortest path. Finally, the proposed method is compared with a fault-tolerant routing algorithm for mesh networks that uses a balanced ring. Simulation results show that the method reacts quickly to network faults, while at each time step the data can choose the optimal path to reach its destination.

      E-shopping has grown in popularity over the years, mainly because people find it convenient and easy to buy various items comfortably from their office or home, but current e-shopping systems are not efficient. The main aim of this paper is therefore to propose a personalized e-shopping system that uses agent technology to enhance the automation and efficiency of the shopping process in Internet commerce. Agent technology serves the customer's needs for availability, speedy response, and efficiency: the e-shopping agent creates connectivity on an anytime-anywhere-any-device basis to provide the specific goods consumers require, with optimization and scalability, and it performs better than systems that do not use agent technology.

      The sharp increase in the use of video surveillance systems reveals a large gap in privacy protection, even in state-of-the-art systems. This work is part of a larger endeavor to develop a privacy-oriented video surveillance system. One part of the Privacy Enhancing Video Surveillance (PEVS) system is a set of tracking methods that follow the objects to be selectively privacy-protected. In this paper, a tracking algorithm based on mean shift clustering is analyzed, and its accuracy is increased by using exponential kernels instead of uniform kernels. Validation shows a 40% increase in accuracy from this proposal.
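The surveillance data are not available here; the sketch below runs a mean shift mode seeker with the exponential kernel the paper advocates, on hypothetical 1-D samples, to show the weighted-mean update converging to the densest cluster (the uniform kernel is included for comparison).

```python
import math

def mean_shift_mode(points, start, bandwidth=1.0, kernel="exponential",
                    tol=1e-6, max_iter=200):
    """Shift a query point to the kernel-weighted mean of the samples
    until convergence. Exponential weights decay smoothly with
    distance; uniform weights are a hard cutoff at the bandwidth."""
    x = float(start)
    for _ in range(max_iter):
        if kernel == "exponential":
            w = [math.exp(-abs(p - x) / bandwidth) for p in points]
        else:  # uniform kernel
            w = [1.0 if abs(p - x) <= bandwidth else 0.0 for p in points]
        total = sum(w)
        if total == 0:
            break
        nxt = sum(wi * p for wi, p in zip(w, points)) / total
        if abs(nxt - x) < tol:
            return nxt
        x = nxt
    return x

# Hypothetical samples: dense cluster near 5, stray points near 0.
data = [4.8, 4.9, 5.0, 5.1, 5.2, 0.0, 0.3]
mode = mean_shift_mode(data, start=4.0)
```

Because the exponential kernel never assigns exactly zero weight, the update surface is smooth everywhere, which is one plausible reason a smooth kernel can track more accurately than a hard uniform window.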

      This work presents the conclusion of a PhD thesis on high-performance video surveillance at the University of Oulu, Finland. The complete video surveillance process and every component are analyzed, introducing a blueprint for future video surveillance systems. The focus of this paper is one key element: a novel user-interface device, the Advanced Video Surveillance Local Control (AVSLC), for access to and control of large-scale video surveillance systems. Implementation and execution of the Security Management Process drastically improve information access and reaction speed in critical situations in the field. All innovations presented were validated in real-world case studies.

      Timeliness is one of the main elements in assuring the usefulness of handheld applications. This study therefore explored timeliness measures and investigated their importance in quantifying the timeliness of handheld application usage. In total, thirteen measures were found to be highly associated with, and positively correlated to, the timeliness of handheld application usage.
