Different Complexity

Selected Abstracts

Particle-in-Cell Simulation of Stationary Plasma Thruster

F. Taccogna
Abstract A very good example of the application of PIC techniques to detailed studies of low-temperature plasmas is the Hall thruster. Here, a variety of models of different complexities is needed to gain better insight into the physics of these systems. Particular emphasis has been placed on geometrical scaling, on the simulation of the plasma-wall interaction inside the acceleration channel, and on ion-neutral collisions in the plume emitted from the thruster. Results show the axial acceleration mechanism, the secondary electron emission instability, the azimuthal fluctuations in the channel, and the ion backflow and electron trapping in the plume. (© 2008 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim) [source]

An automated approach for abstracting execution logs to execution events

Zhen Ming Jiang
Abstract Execution logs are generated by output statements that developers insert into the source code. Execution logs are widely available and are helpful in monitoring, remote issue resolution, and understanding of complex enterprise applications. There are many proposals for standardized log formats, such as the W3C and SNMP formats. However, most applications use ad hoc, non-standardized logging formats. Automated analysis of such logs is complex due to their loosely defined structure and a large, non-fixed vocabulary of words. The large volume of logs produced by enterprise applications limits the usefulness of manual analysis techniques. Automated techniques are needed to uncover the structure of execution logs. Using the uncovered structure, sophisticated analysis of logs can be performed. In this paper, we propose a log abstraction technique that recognizes the internal structure of each log line. Using the recovered structure, log lines can be easily summarized and categorized to help comprehend and investigate the complex behavior of large software applications. Our proposed approach handles free-form log lines with minimal requirements on the format of a log line. Through a case study using log files from four enterprise applications, we demonstrate that our approach abstracts log files of different complexities with high precision and recall. Copyright © 2008 John Wiley & Sons, Ltd. [source]
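The abstract does not give the authors' actual heuristics, but the core idea of log abstraction, collapsing the dynamic parts of a log line into placeholders so that lines generated by the same output statement share one template, can be sketched as follows. The placeholder names and regular expressions here are illustrative assumptions, not the paper's method:

```python
import re
from collections import Counter

def abstract_log_line(line: str) -> str:
    """Collapse dynamic fields (IPs, hex IDs, numbers) into placeholders,
    leaving the static text as the execution-event template."""
    line = re.sub(r"\b\d{1,3}(\.\d{1,3}){3}\b", "<IP>", line)   # IPv4 addresses
    line = re.sub(r"\b0x[0-9a-fA-F]+\b", "<HEX>", line)          # hex identifiers
    line = re.sub(r"\b\d+\b", "<NUM>", line)                     # plain integers
    return line

logs = [
    "open session 4312 for user 17",
    "open session 9981 for user 3",
    "timeout on host 10.0.0.5 after 30 seconds",
]
# Lines from the same output statement collapse into one event template.
events = Counter(abstract_log_line(line) for line in logs)
```

With the three sample lines above, the two "open session" lines map to the same template, so only two distinct events remain, which is the summarization step the abstract describes.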

Embryonic keratinization in vertebrates in relation to land colonization

ACTA ZOOLOGICA, Issue 1 2009
L. Alibardi
Abstract The embryogenesis and cytology of the epidermis in different vertebrates vary in relation to the formation of a stratum corneum of different complexity. The latter process was essential for land colonization during vertebrate evolution and produced an efficient barrier in amniotes. Keratinocytes are made of cross-linked keratins associated with specific proteins and lipids that are produced at advanced stages of embryogenesis, when the epidermis becomes stratified. In these stages the epidermis changes from an aquatic to a terrestrial type, preadapted for the impact of the dry terrestrial environment at hatching or parturition. The epidermal barrier against water loss, mechanical and chemical stress, and microbial penetration is completely formed shortly before birth. Beneath the outer periderm, variably stratified embryonic layers containing glycine-rich alpha-keratins form in preparation for adult life. The subsequent layers of the epidermis produce proteins for the formation of the cornified cell membrane and of the cornified material present in keratinocytes of the adult epidermis in reptiles, birds, and mammals. The general features of the process of soft cornification in the embryonic epidermis of vertebrates are presented. Cornification in developing reptilian scales, avian feathers, and mammalian hairs is mainly related to the evolution of keratin-associated proteins. The latter proteins form the resistant matrix of hard skin derivatives such as claws, beaks, nails, and horns. [source]

1-Hz repetitive TMS over ipsilateral motor cortex influences the performance of sequential finger movements of different complexity

Laura Avanzino
Abstract To elucidate the role of the ipsilateral motor cortex (M1) in the control of unilateral finger movements (UFMs) in humans, we used a conditioning protocol of 1-Hz repetitive transcranial magnetic stimulation (1-Hz rTMS) over M1 in 11 right-handed healthy subjects. We analysed the effects of conditioning rTMS on UFMs of different complexity (simple vs sequential finger movements) performed with different modalities (internally vs externally paced movements). UFMs were monitored with a sensor-engineered glove, and a quantitative evaluation of the following parameters was performed: touch duration (TD); inter-tapping interval (ITI); timing error (TE); and number of errors (NE). 1-Hz rTMS over ipsilateral M1 affected the performance of a sequence of finger opposition movements in a metronome-paced condition, significantly increasing TD and reducing ITI without TE changes. The effects on motor behaviour varied in magnitude as a function of sequence complexity. Further, ipsilateral 1-Hz rTMS affected externally paced movements differently from internally paced ones. All these findings indicate that ipsilateral M1 plays an important role in the execution of sequential UFMs. Interestingly, NE did not change in any experimental condition, suggesting that ipsilateral M1 influences only the temporal, and not the spatial, accuracy of UFMs. Finally, the duration (up to 30 min) of 1-Hz rTMS effects on ipsilateral M1 may indicate a direct action on the mechanisms of cortical plasticity, suggesting that rTMS could be used to modulate the communication between the two hemispheres in rehabilitative protocols. [source]

Nutrient fluxes at the river basin scale. II: The balance between data availability and model complexity
Abstract In order to model complex environmental systems, one needs to find a balance between model complexity and the quality of the data available to run and validate the model. This paper describes a method to find this balance. Four models of different complexity were applied to describe the transfer of nitrogen and phosphorus from pollution sources to river outlets in two large European river basins (Rhine and Elbe). A comparison of the predictive capability of these four models indicates the added value of the added model complexity. We also quantified the errors in the data that were used to run and validate the models, and analysed to what extent the model validation errors could be attributed to data errors and to what extent to shortcomings of the models. We conclude that although the addition of more process description is interesting from a theoretical point of view, it does not necessarily improve the predictive capability. Although our analysis is based on an extensive pollution-source and river-load database, it appeared that the information content of this database was sufficient only to support models of limited complexity. Our analysis also illustrates that, for a proper justification of a model's degree of complexity, one should compare the model to simplified versions of itself. Copyright © 2001 John Wiley & Sons, Ltd. [source]
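The abstract's central point, that added model complexity must earn its keep on validation data, can be illustrated with a toy experiment. The data, the polynomial models, and the split below are illustrative assumptions standing in for the paper's nutrient models and pollution-source database:

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 40)
y = 2.0 * x + rng.normal(0.0, 0.3, x.size)   # underlying process is linear + noise

# Interleaved split, mimicking separate data for running and validating a model.
x_cal, y_cal = x[::2], y[::2]
x_val, y_val = x[1::2], y[1::2]

val_err = {}
for degree in (1, 3, 7):                      # models of increasing complexity
    coeffs = np.polyfit(x_cal, y_cal, degree)
    pred = np.polyval(coeffs, x_val)
    val_err[degree] = float(np.sqrt(np.mean((pred - y_val) ** 2)))
# With noisy calibration data, the degree-7 model tends to fit noise and
# rarely beats the simpler degree-1 model on the held-out validation set.
```

Comparing `val_err` across degrees is the toy analogue of comparing a model against simplified versions of itself, as the paper recommends.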

Model-Based Control of a Parallel Robot: A Comparison of Control Algorithms

Prof. Dr. Hubert Hahn
Article first published online: 25 MAR 200
In this contribution the control behavior of a special parallel-robot construction, called a multi-axis test facility, is investigated. After a brief discussion of the robot's different tasks, its construction is presented. To solve the tasks, different control algorithms are derived based on robot model equations of different complexity. Depending on the task to be performed by the robot, the controllers compensate the kinematic and/or kinetic coupling of the robot's degrees of freedom, stabilize the system, and achieve the desired spatial motion of each degree of freedom as well as sufficient robustness with respect to parameter uncertainties and load variations. A few results obtained in computer simulations and laboratory experiments are presented and judged with respect to the quality of control, the closeness of the computer simulations to reality, and the cost and effort needed to realize the different solutions. [source]

Complexity Analysis Based on Image-Processing Method and Pixelized Recognition of Chinese Characters Using Simulated Prosthetic Vision

Kun Yang
Abstract The influence of complexity and the minimum resolution necessary for recognition of pixelized Chinese characters (CCs) were investigated using simulated prosthetic vision. An image-processing method was used to evaluate the complexity of CCs, defined as the frequency of black pixels and computed with a black-pixel-statistics complexity algorithm. The 631 most commonly used CCs, which deliver 80% of the information in Chinese daily reading, were chosen as the testing database in order to avoid negative effects due to illegibility and unfamiliarity. CCs in Hei font style were captured as images and pixelized as 6 × 6, 8 × 8, 10 × 10, and 12 × 12 pixel arrays with square dots. Recognition accuracy of CCs of different complexity and different pixel-array sizes was tested using simulated prosthetic vision. The results indicate that both pixel-array size and complexity have a significant impact on pixelized reading of CCs. Recognition accuracy of pixelized CCs drops as complexity increases and as the number of pixels decreases. More than 80% of CCs of any complexity can be recognized correctly with a 10 × 10 pixel array, which can sufficiently support pixelized reading of CCs for a visual prosthesis. Pixelized reading of CCs at low resolution is possible only for characters with low complexity (less than 0.16 for a 6 × 6 pixel array and less than 0.24 for an 8 × 8 pixel array). [source]
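The complexity measure used above, the frequency of black pixels in a binarized character image, reduces to a short computation. The toy glyph below is an invented example (the paper's binarization and font rendering details are not given in the abstract):

```python
def black_pixel_complexity(bitmap):
    """Complexity of a binarized character image:
    fraction of pixels that are black (value 1)."""
    total = sum(len(row) for row in bitmap)
    black = sum(sum(row) for row in bitmap)
    return black / total

# A toy 6 x 6 glyph; 1 = black pixel, 0 = white.
glyph = [
    [0, 1, 1, 1, 1, 0],
    [0, 0, 1, 0, 0, 0],
    [0, 0, 1, 0, 0, 0],
    [0, 1, 1, 1, 0, 0],
    [0, 0, 1, 0, 0, 0],
    [0, 1, 1, 1, 1, 0],
]
c = black_pixel_complexity(glyph)  # 14 black pixels out of 36, c = 14/36
```

Under the thresholds reported in the abstract, this glyph's complexity of about 0.39 exceeds the 0.16 limit for a 6 × 6 array, so it would be a poor candidate for low-resolution pixelized reading.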

Approximation and complexity trade-off by TP model transformation in controller design: A case study of the TORA system

Zoltán Petres
Abstract The main objective of the paper is to study the approximation and complexity trade-off capabilities of the recently proposed tensor product distributed compensation (TPDC) based control design framework. TPDC is the combination of the TP model transformation and the parallel distributed compensation (PDC) framework. The tensor product (TP) model transformation includes a Higher-Order Singular Value Decomposition (HOSVD)-based technique to solve the approximation and complexity trade-off. In this paper we generate TP models with different complexity and approximation properties, and then derive controllers for them. We analyze how the trade-off affects the model behavior and control performance. All these properties are studied via the state feedback controller design of the Translational Oscillations with an Eccentric Rotational Proof Mass Actuator (TORA) system. Copyright © 2010 John Wiley and Sons Asia Pte Ltd and Chinese Automatic Control Society [source]
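The HOSVD step at the heart of the TP model transformation can be sketched numerically: unfold the tensor along each mode, take the leading left singular vectors as factor matrices, and contract them with the tensor to get the core; truncating the per-mode ranks is exactly the approximation-versus-complexity trade-off the abstract discusses. This is a generic NumPy sketch of HOSVD, not the TPDC toolbox itself, and the random tensor stands in for a sampled model:

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: move the given mode to the front and flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd(T, ranks):
    """Truncated HOSVD: per-mode factor matrices and the core tensor."""
    Us = []
    for mode, r in enumerate(ranks):
        U, _, _ = np.linalg.svd(unfold(T, mode), full_matrices=False)
        Us.append(U[:, :r])                 # keep r leading singular vectors
    core = T
    for mode, U in enumerate(Us):           # contract U^T into each mode
        core = np.moveaxis(np.tensordot(U.T, core, axes=(1, mode)), 0, mode)
    return core, Us

rng = np.random.default_rng(0)
T = rng.standard_normal((4, 5, 6))
core, Us = hosvd(T, (4, 5, 6))              # full ranks: lossless decomposition

approx = core                               # reconstruct by contracting U back in
for mode, U in enumerate(Us):
    approx = np.moveaxis(np.tensordot(U, approx, axes=(1, mode)), 0, mode)
```

With full ranks the reconstruction is exact; choosing smaller ranks in `hosvd` yields a cheaper TP model at the price of approximation error, which is the trade-off studied on the TORA system.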