Methodology


Kinds of Methodology

  • action research methodology
  • alternative methodology
  • analysis methodology
  • analytic methodology
  • analytical methodology
  • appropriate methodology
  • assessment methodology
  • bayesian methodology
  • case study methodology
  • collection methodology
  • computational methodology
  • control methodology
  • culture methodology
  • current methodology
  • data methodology
  • design methodology
  • detection methodology
  • developed methodology
  • development methodology
  • different methodology
  • efficient methodology
  • empirical methodology
  • evaluation methodology
  • event study methodology
  • existing methodology
  • experimental methodology
  • extraction methodology
  • focus group methodology
  • general methodology
  • grounded theory methodology
  • group methodology
  • hybrid methodology
  • identification methodology
  • improved methodology
  • innovative methodology
  • management methodology
  • mapping methodology
  • modeling methodology
  • modelling methodology
  • new methodology
  • new synthetic methodology
  • nmr methodology
  • novel methodology
  • numerical methodology
  • optimization methodology
  • other methodology
  • panel data methodology
  • participatory methodology
  • phenomenological methodology
  • powerful methodology
  • present methodology
  • presented methodology
  • promising methodology
  • proposed methodology
  • proteomic methodology
  • qualitative methodology
  • qualitative research methodology
  • quantitative methodology
  • reliable methodology
  • research methodology
  • response surface methodology
  • rigorous methodology
  • risk assessment methodology
  • robust design methodology
  • robust methodology
  • same methodology
  • sampling methodology
  • science methodology
  • scientific methodology
  • screening methodology
  • several methodology
  • similar methodology
  • simple methodology
  • solution methodology
  • specific methodology
  • standard methodology
  • standardized methodology
  • statistical methodology
  • straightforward methodology
  • study methodology
  • surface methodology
  • survey methodology
  • synthesis methodology
  • synthetic methodology
  • systematic methodology
  • testing methodology
  • theoretical methodology
  • theory methodology
  • traditional methodology
  • useful methodology
  • various methodology

  • Terms modified by Methodology

  • methodology available
  • methodology suitable
  • methodology used

  • Selected Abstracts


    High-Risk Cutaneous Squamous Cell Carcinoma without Palpable Lymphadenopathy: Is There a Therapeutic Role for Elective Neck Dissection?

    DERMATOLOGIC SURGERY, Issue 4 2007
    JUAN-CARLOS MARTINEZ MD
    PURPOSE The beneficial role of elective neck dissection (END) in the management of high-risk cutaneous squamous cell carcinoma (CSCC) of the head and neck remains unproven. Some surgical specialists suggest that END may be beneficial for patients with clinically node-negative (N0) high-risk CSCC, but there are few data to support this claim. We reviewed the available literature regarding the use of END in the management of both CSCC and head and neck SCC (HNSCC). METHODOLOGY The available medical literature pertaining to END in both CSCC and HNSCC was reviewed using PubMed and Ovid Medline searches. RESULTS Many surgical specialists recommend that END be routinely performed in patients with N0 HNSCC when the risk of occult metastases is estimated to exceed 20%; however, patients who undergo END have no proven survival benefit over those who are initially staged as N0 and undergo therapeutic neck dissection (TND) after the development of apparent regional disease. There is a lack of data regarding the proper management of regional nodal basins in patients with N0 CSCC. In the absence of evidence-based data, the cutaneous surgeon must rely on clinical judgment to guide the management of patients with N0 high-risk CSCC of the head and neck. CONCLUSIONS Appropriate work-up for occult nodal disease may occasionally be warranted in patients with high-risk CSCC. END may play a role in only a very limited number of patients with high-risk CSCC. [source]


    RESPONSE SURFACE METHODOLOGY FOR STUDYING THE QUALITY CHARACTERISTICS OF COWPEA (VIGNA UNGUICULATA)-BASED TEMPEH

    JOURNAL OF FOOD PROCESS ENGINEERING, Issue 4 2010
    GEORGE AMPONSAH ANNOR
    ABSTRACT Response surface methodology was used to optimize the processing conditions in the preparation of cowpea tempeh. The independent factors studied were boiling time (varying from 5 to 30 min), incubation time (varying from 12 to 48 h) and incubation temperature (varying from 25 to 50C), whereas the dependent factors were protein content, protein solubility, pH, titratable acidity and total color difference (using L*, a* and b*). Regression models were generated and adequacy was tested with regression coefficients (R2) and lack-of-fit tests. Optimum processing conditions were determined by the method of superimposition. There was a strong and significant influence (P < 0.01) of the quadratic effect of incubation time on the protein content of the cowpea tempeh, with similar significance (P < 0.01) noted in protein solubility with increasing boiling time. The optimum processing conditions observed for the preparation of cowpea tempeh were a boiling time of about 20 min, an incubation time of about 28 h and an incubation temperature of about 37C. PRACTICAL APPLICATIONS Response surface methodology (RSM), as a statistical tool, has been effectively used in food process applications. This study embraced the use of RSM in the optimization of the processing conditions involved in the preparation of cowpea tempeh. Superimposition of the contour plots developed from the regression models indicated that cowpea tempeh with optimum quality characteristics should be processed at a boiling time of 20 min, an incubation time of 28 h and an incubation temperature of 37C. These conditions could be adopted for the industrial production of cowpea tempeh. [source]
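The core computation behind RSM studies like the one above is fitting a second-order polynomial to experimental responses and solving for the stationary point. A minimal sketch with synthetic data (the factor ranges echo the abstract, but the response surface and its optimum at 28 h / 37C are invented for illustration, not the paper's data):

```python
import numpy as np

# Second-order response surface fit, the core step of RSM (synthetic data:
# the response peaks at x1 = 28 h incubation time, x2 = 37C temperature).
rng = np.random.default_rng(0)
x1 = rng.uniform(12, 48, 60)                  # incubation time (h)
x2 = rng.uniform(25, 50, 60)                  # incubation temperature (C)
y = 10 - 0.02*(x1 - 28)**2 - 0.05*(x2 - 37)**2 + rng.normal(0, 0.1, 60)

# Design matrix for y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1*x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)

# Stationary point: set the gradient to zero and solve the 2x2 linear system
# [[2*b11, b12], [b12, 2*b22]] @ [x1, x2] = -[b1, b2]
A = np.array([[2*b[3], b[5]], [b[5], 2*b[4]]])
optimum = np.linalg.solve(A, -b[1:3])
print(optimum)        # should land near (28, 37)
```

In a real study the fitted surface would also be checked with R2 and a lack-of-fit test, as the abstract describes, before trusting the stationary point.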


    OPTIMIZATION OF PRE-FRY DRYING OF YAM SLICES USING RESPONSE SURFACE METHODOLOGY

    JOURNAL OF FOOD PROCESS ENGINEERING, Issue 4 2010
    OLAJIDE PHILIP SOBUKOLA
    ABSTRACT The effect of convective hot-air drying pretreatment and frying time at a frying temperature of 170 ± 1C on moisture and oil contents, breaking force (crispness) and color parameters of yam chips was investigated. Response surface methodology was used to develop models for the responses as a result of variation in levels of drying temperature (60–80C), drying time (1–5 min) and frying time (2–6 min). Drying pretreatment had a significant effect on oil and moisture contents, breaking force and color parameters of yam chips, with water removal exhibiting a typical drying profile. Response surface regression analysis shows that responses were significantly (P < 0.05) correlated with drying temperature and time and frying time. The optimum pre-fry drying condition observed was a drying temperature of 70–75C for about 3–4 min, with frying for 4–5 min. PRACTICAL APPLICATIONS Deep-fat frying is a very important cooking method, and much effort has been devoted to manufacturing fried products with lower oil content and acceptable quality parameters. The information provided in this work will be very useful in manufacturing fried yam chips of acceptable quality attributes through the combination of drying pretreatment conditions. The result is also useful for considering different processing variables and responses at the same time, as compared with the single-factor experiments common in the literature. [source]


    OPTIMIZATION OF NEW FLOUR IMPROVER MIXING FORMULA BY SURFACE RESPONSE METHODOLOGY

    JOURNAL OF FOOD PROCESS ENGINEERING, Issue 2 2010
    RAOUDHA ELLOUZE GHORBEL
    ABSTRACT In the present study, we sought to improve the viscoelastic properties of wheat flour characterized by a low bread-making quality. Six regulators were tested: broad bean flour, gluten, monodiglyceride (MDG), ascorbic acid, sodium alginate and a mixture of amylase and xylanase. A hybrid design was carried out in order to study the effect of these regulators on the alveographic properties of wheat flour dough. Two alveographic responses (W: deformation energy and P/L: elasticity-to-extensibility ratio) were studied and simultaneously optimized via desirability functions. An optimal mixture, containing 13.17 g/kg of broad bean flour, 15.13 g/kg of gluten, 0.155 g/kg of ascorbic acid, 3.875 g/kg of MDG, 2.75 g/kg of sodium alginate and 0.3 g/kg of enzyme mixture, was obtained and tested in a Tunisian flour. It led to a dough characterized by W = 274 × 10⁻⁴ J and P/L = 0.74, versus 191 × 10⁻⁴ J and 0.40, respectively, for the Tunisian flour without improvers. PRACTICAL APPLICATIONS In this work, we developed a new flour improver mixing formula intended to be used with wheat flour characterized by a low bread-making quality. This improver mixture is in powder form and contains 13.17 g of broad bean flour, 15.13 g of gluten, 0.155 g of ascorbic acid, 3.875 g of monodiglyceride, 2.75 g of sodium alginate and 0.3 g of enzyme mixture per kilogram of wheat flour. The incorporation of this improver mixture in low bread-making quality wheat flour increases its deformation energy (W) by about 43% and produces large-volume bread. [source]


    OPTIMIZATION OF PERMEABILIZATION PROCESS FOR LACTOSE HYDROLYSIS IN WHEY USING RESPONSE SURFACE METHODOLOGY

    JOURNAL OF FOOD PROCESS ENGINEERING, Issue 3 2009
    GURPREET KAUR
    ABSTRACT To overcome the permeability barrier and prepare whole cell biocatalysts with high activities, permeabilization of Kluyveromyces marxianus var. lactis NCIM 3566 in relation to β-galactosidase activity was optimized using cetyltrimethylammonium bromide (CTAB) as permeabilizing agent. Permeabilized whole cells can be advantageous over pure enzyme preparations in terms of cost-effectiveness and increased stability maintained by the intracellular environment. Response surface methodology (RSM) was applied to optimize the concentration of CTAB, temperature and treatment time for maximum permeabilization of yeast cells. The optimum operating conditions for the permeabilization process to achieve maximum enzyme activity obtained by RSM were 0.06% (w/v) CTAB concentration, 28C temperature and a process duration of 14 min. At these conditions of the process variables, the maximum value of enzyme activity was found to be 1,334 IU/g. The permeabilized yeast cells were highly effective and resulted in 90.5% lactose hydrolysis in whey. PRACTICAL APPLICATION β-Galactosidase is one of the most promising enzymes, with several applications in the food, fermentation and dairy industries. However, the industrial applications of β-galactosidase have been hampered by the costs involved in downstream processing. The present investigation focused on developing a low-cost technology for lactose hydrolysis based on the permeabilization process. Disposal of lactose in whey and whey permeates is one of the most significant problems with regard to economics and environmental impact faced by the dairy industries. Keeping this in view, lactose hydrolysis in whey has been successfully performed using permeabilized Kluyveromyces marxianus cells. Hydrolysis of lactose using β-galactosidase converts whey into a potentially very useful food ingredient, which has immense applications in food industries.
Its use has increased significantly in recent years, mainly in the dairy products and in digestive preparations. Lactose hydrolysis causes several potential changes in the manufacture and marketing of dairy products, including increased solubility, sweetness and broader fermentation possibilities. [source]


    APPLICATION OF RESPONSE SURFACE METHODOLOGY FOR THE OSMOTIC DEHYDRATION OF CARROTS

    JOURNAL OF FOOD PROCESS ENGINEERING, Issue 6 2006
    BAHADUR SINGH
    ABSTRACT Osmotic dehydration of carrot cubes in sodium chloride salt solutions at different solution concentrations, temperatures and process durations was analyzed for water loss and solute gain. The osmotically pretreated carrot cubes were further dehydrated in a cabinet dryer at 65C and were then rehydrated in water at ambient temperature for 8–10 h and analyzed for rehydration ratio, color and overall acceptability of the rehydrated product. The process was optimized for maximum water loss, rehydration ratio and overall acceptability of the rehydrated product, and for minimum solute gain and shrinkage of the rehydrated product, by response surface methodology. The optimum conditions of the various process parameters were 11% salt concentration, 30C osmotic solution temperature and a process duration of 120 min. [source]


    OPTIMIZATION OF NATTOKINASE PRODUCTION CONDITIONS USING RESPONSE SURFACE METHODOLOGY

    JOURNAL OF FOOD PROCESS ENGINEERING, Issue 1 2006
    DJA-SHIN WANG
    ABSTRACT Natto has attracted worldwide attention because of its health benefits and long history in Japanese food. It has been found that a potent fibrinolytic enzyme named nattokinase, which is extracted from natto, is able to prevent atherosclerosis. The production of nattokinase may be influenced by various factors such as temperature, shaking speed, volume of medium, fermentation time and so forth. Three-step response surface methodology was applied to obtain the optimal operation conditions of the fermentation process in order to maximize the nattokinase yield. The three major steps are described as follows. First, the important factors for fermentation were identified by L8 orthogonal array experiment. The chosen factors were temperature (37 or 45C), shaking speed (110 or 150 rpm), volume of medium (80 or 120 mL), Brix of wheat bran extract (1.5 or 3°), Brix of soy meal extract (1 or 2°), glucose concentration (0.6 or 1.2%) and fermentation time (24 or 36 h). Second, a regression equation was established between the response (i.e., the enzyme activity) and the two statistically significant factors (i.e., the volume of medium and fermentation time). Third, the optimal solutions for the volume of medium and fermentation time were obtained based on the response surface of the regression equation. According to the response surface analysis, the optimal operation conditions for the fermentation process should be 80 mL and 37.0817 h for the volume of medium and the fermentation time, respectively, which resulted in 459.11 FU/mL as the predicted enzyme activity. [source]


    CONSUMER-BASED OPTIMIZATION OF PEANUT-CHOCOLATE BAR USING RESPONSE SURFACE METHODOLOGY

    JOURNAL OF FOOD PROCESSING AND PRESERVATION, Issue 3-4 2005
    EDITH M. SAN JUAN
    ABSTRACT The acceptability of the sensory properties of a peanut-chocolate bar was optimized for consumer acceptance using response surface methodology. The factors studied included sugar, peanuts, cocoa powder and a process variable, degree of roast. Twenty-seven peanut-chocolate bar formulations with two replications were evaluated for consumer acceptance (n = 168) for overall liking and acceptance of color, appearance, flavor, sweetness and texture using 9-point hedonic scales. In terms of overall liking, the use of dark-roasted peanuts received the largest number of acceptable formulations when compared to the medium- and light-roasted peanuts. Sensory evaluation indicated that sweetness acceptance was the limiting factor for acceptability. An acceptable peanut-chocolate bar can be obtained by using formulations containing 44–54% dark-, medium- or light-roasted peanuts, 1–4% cocoa powder and 41–55% sugar. [source]


    OPTIMIZATION OF VACUUM PULSE OSMOTIC DEHYDRATION OF CANTALOUPE USING RESPONSE SURFACE METHODOLOGY

    JOURNAL OF FOOD PROCESSING AND PRESERVATION, Issue 1 2005
    WILMER J. FERMIN
    ABSTRACT The optimum levels of vacuum pressure, concentration of osmotic solution and dehydration time for vacuum pulse osmotic dehydration of cantaloupe were determined by response surface methodology (RSM). The response surface equations (P < 0.05 and lack of fit > 0.1) explain 97.6, 88.0 and 97.1% of the variability in weight loss, water loss and °Brix increase, respectively, at the 95% confidence level. The canonical analysis for each response indicated that the stationary point is a saddle point for weight loss and °Brix increase, and a point of maximum response for water loss. The region that best satisfied all the constraints (low values of weight loss and °Brix increase, and a high value of water loss) is located within the intervals from 49.5 °Brix to 52.5 °Brix for concentration and from 75 min to 84 min for dehydration time, at a vacuum pulse of 740 mbar. [source]


    FORMULATION OF A SOY–COFFEE BEVERAGE BY RESPONSE SURFACE METHODOLOGY AND INTERNAL PREFERENCE MAPPING

    JOURNAL OF SENSORY STUDIES, Issue 2010
    ILANA FELBERG
    ABSTRACT Coffee consumers (n = 60) tasted and rated samples of a new soy–coffee beverage made from instant coffee, soymilk powder and sugar. Ingredient concentrations (independent variables) varied according to a 2³ central composite design for overall degree of acceptance. Data were analyzed by analysis of variance (ANOVA), least square difference and response surface methodology, followed by internal preference mapping (IPM) with cluster analysis. ANOVA of the consumers' acceptance data revealed that samples differed significantly (P ≤ 0.05). Although soymilk content did not significantly influence consumers' acceptance in the tested range, IPM with cluster analysis indicated that at least part of the acceptance differences was based on the soy beverage consumption habit. The final beverage formulation was evaluated cold and hot for overall acceptability (9-point structured hedonic scale) by 112 coffee consumers, and the cold beverage reached a good mean acceptability score (6.2) among the participants. PRACTICAL APPLICATIONS The consumption of soy products has been reported to reduce the risk of several diseases, and a number of recent studies have found beneficial health properties attributed to coffee. Considering the current consumer trend for healthier alternatives in food products, we decided to combine the health benefits of these two important Brazilian commodities in a functional beverage. In order to optimize the formulation and maximize sensory acceptance, we performed consumer tests using response surface methodology. Internal preference mapping and cluster analyses were also applied to provide information on the variability of individual consumer opinions and to segment them into groups with similar preference criteria. [source]


    CURRENT-STATUS SURVIVAL ANALYSIS METHODOLOGY APPLIED TO ESTIMATING SENSORY SHELF LIFE OF READY-TO-EAT LETTUCE (LACTUCA SATIVA)

    JOURNAL OF SENSORY STUDIES, Issue 2 2008
    MABEL ARANEDA
    ABSTRACT The objective of the present work was to develop a method for predicting sensory shelf life for situations in which each consumer evaluates only one sample corresponding to one storage time. This type of data is known as current-status data in survival analysis statistics. The methodology was applied to estimate the sensory shelf life of ready-to-eat lettuce (Lactuca sativa var. capitata cv. "Alpha"). For each of six storage times, 50–52 consumers answered yes or no to whether they would normally consume the presented sample. The results were satisfactory, showing that the methodology can be applied when necessary. The Weibull model was found adequate to model the data. Estimated shelf lives ± 95% confidence intervals were 11.3 ± 1.2 days and 15.5 ± 0.9 days for a 25% and a 50% consumer rejection probability, respectively. PRACTICAL APPLICATIONS When considering shelf-life evaluations by consumers, the first idea is to have each consumer evaluate six or seven samples with different storage times in a single session. To do this, a reverse storage design is necessary, and in the case of a product such as lettuce, it would lead to different batches being confused with storage times. The methodology proposed in this article avoids this problem by having each consumer evaluate a single sample. Another issue with consumers tasting several samples in a single session is how representative this situation is of real consumption. The present methodology allows for a consumer to take home, e.g., a bottle of beer with an established storage time, and later collecting the information as to whether they found the beer acceptable or not. This is a situation much closer to real consumption. [source]
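Current-status data can be fitted by maximum likelihood: a consumer who rejects a sample stored t days contributes F(t) to the likelihood, and one who accepts contributes 1 − F(t), where F is the rejection-time distribution. A minimal sketch with a Weibull F and fully simulated answers (the storage times, sample sizes, and parameter values are assumptions for illustration, not the paper's data):

```python
import numpy as np
from scipy.optimize import minimize

# Current-status likelihood: each consumer sees ONE storage time t and rejects
# (d = 1) or accepts (d = 0). With rejection-time cdf F = Weibull(k, lam):
#   L = prod F(t)^d * (1 - F(t))^(1 - d)
# All data below are simulated; 50 consumers per storage time.
rng = np.random.default_rng(1)
times = np.repeat([2.0, 5.0, 8.0, 11.0, 14.0, 17.0], 50)
true_k, true_lam = 3.0, 16.0
p_reject = 1 - np.exp(-(times / true_lam) ** true_k)
d = (rng.random(times.size) < p_reject).astype(float)

def negloglik(logp):
    k, lam = np.exp(logp)                       # log-params keep k, lam > 0
    F = np.clip(1 - np.exp(-(times / lam) ** k), 1e-12, 1 - 1e-12)
    return -np.sum(d * np.log(F) + (1 - d) * np.log(1 - F))

k, lam = np.exp(minimize(negloglik, np.log([1.0, 10.0]), method="Nelder-Mead").x)

# Shelf life at rejection probability p is the Weibull quantile
t25 = lam * (-np.log(1 - 0.25)) ** (1 / k)
t50 = lam * (-np.log(1 - 0.50)) ** (1 / k)
print(t25, t50)
```

The 25% and 50% rejection quantiles computed at the end mirror the two shelf-life figures reported in the abstract; confidence intervals would come from the curvature of the log-likelihood or a bootstrap.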


    OPTIMIZATION OF A CHOCOLATE PEANUT SPREAD USING RESPONSE SURFACE METHODOLOGY (RSM)

    JOURNAL OF SENSORY STUDIES, Issue 3 2004
    C.A. CHU
    ABSTRACT Response surface methodology was used to optimize formulations of chocolate peanut spread. Thirty-six formulations with varying levels of peanut (25-90%), chocolate (5-70%) and sugar (5-55%) were processed using a three-component constrained simplex lattice design. The processing variable, roast (light, medium, dark), was also included in the design. Response variables, measured with consumers (n = 60) participating in the test, were spreadability, overall acceptability, appearance, color, flavor, sweetness and texture/mouthfeel, using a 9-point hedonic scale. Regression analysis was performed and models were built for each significant (p < 0.01) response variable. Contour plots for each attribute, at each level of roast, were generated and superimposed to determine areas of overlap. Optimum formulations (consumer acceptance rating of ≥ 6.0 for all attributes) for chocolate peanut spread were all combinations of 29-65% peanut, 9-41% chocolate, and 17-36% sugar, adding up to 100%, at a medium roast. Verification of two formulations indicated no difference between predicted and observed values. [source]
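The contour-superimposition step can be sketched numerically: scan a grid of feasible formulations and keep those whose predicted ratings all clear the acceptance cutoff. The two rating models below are invented placeholders, not the paper's fitted models; only the overlap logic follows the abstract:

```python
import numpy as np

# "Superimposing contour plots" as a grid scan: a formulation is acceptable
# when every fitted attribute model predicts a rating >= 6.0.
def flavor(peanut, chocolate):      # hypothetical fitted model #1
    return 6.8 - 0.001*(peanut - 45)**2 - 0.002*(chocolate - 25)**2

def texture(peanut, chocolate):     # hypothetical fitted model #2
    return 6.5 - 0.002*(peanut - 50)**2

acceptable = []
for peanut in np.arange(25, 91, 1.0):
    chocolate_max = min(70.0, 95.0 - peanut)    # sugar >= 5% closes the mixture
    for chocolate in np.arange(5, chocolate_max + 1, 1.0):
        if flavor(peanut, chocolate) >= 6.0 and texture(peanut, chocolate) >= 6.0:
            acceptable.append((peanut, chocolate))
print(len(acceptable))              # size of the overlap region on this grid
```

With real fitted models, the surviving grid cells trace out the same overlap region the paper reports as ranges of peanut, chocolate and sugar.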


    SO YOU ALREADY HAVE A SURVEY DATABASE? A SEVEN-STEP METHODOLOGY FOR THEORY BUILDING FROM SURVEY DATABASES: AN ILLUSTRATION FROM INCREMENTAL INNOVATION GENERATION IN BUYER–SELLER RELATIONSHIPS

    JOURNAL OF SUPPLY CHAIN MANAGEMENT, Issue 4 2010
    SUBROTO ROY
    Across business disciplines, the importance of database research for theory testing continues to increase. The availability of data also has increased, though methods to analyze and interpret these data lag. This research proposes a method for extracting strong measures from survey databases by a progression from qualitative to quantitative techniques. To test the proposed method, this study uses the Industrial Marketing and Purchasing (IMP) survey database, which includes data from firms in several European countries. The proposed method consists of two phases and seven steps, as illustrated in the context of the firm's incremental innovation generation for buyer–seller relationships. This systematic progression moves from a broad but valid empirical case study to the development of a narrow and reliable measure of incremental innovation generation in the IMP database. The proposed method can use supply chain survey databases for theory development without requiring primary data collection, assuming certain conditions. [source]


    COMPROMISE PROGRAMMING METHODOLOGY FOR DETERMINING INSTREAM FLOW UNDER MULTIOBJECTIVE WATER ALLOCATION CRITERIA

    JOURNAL OF THE AMERICAN WATER RESOURCES ASSOCIATION, Issue 5 2006
    Jenq-Tzong Shiau
    ABSTRACT: This paper presents a quantitative assessment framework for determining the instream flow under multiobjective water allocation criteria. The Range of Variability Approach (RVA) is employed to evaluate the hydrologic alterations caused by flow diversions, and the resulting degrees of alteration for the 32 Indicators of Hydrologic Alteration (IHAs) are integrated as an overall degree of hydrologic alteration. By including this index in the objective function, it is possible to optimize the water allocation scheme using compromise programming to minimize the hydrologic alteration and water supply shortages. The proposed methodology is applied to a case study of the Kaoping diversion weir in Taiwan. The results indicate that the current release of 9.5 m3/s as a minimum instream flow does not effectively mitigate the highly altered hydrologic regime. Increasing the instream flow would reduce the overall degree of hydrologic alteration; however, this is achieved at the cost of increasing the water supply shortages. The effects on the optimal instream flow of the weighting factors assigned to water supplies and natural flow variations are also investigated. With equal weighting assigned to the multiple objectives, the optimal instream flow of 26 m3/s leads to a less severely altered hydrologic regime, especially for those low-flow characteristics, thereby providing a better protection of the riverine environment. [source]
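Compromise programming selects the alternative that minimizes a weighted distance to the ideal point in objective space. A toy sketch of the trade-off described above, with hydrologic alteration falling and supply shortage rising as instream flow increases (both objective functions and all numbers are invented; only the structure follows the abstract):

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Compromise programming: minimize the weighted L2 distance of the normalized
# objectives from the ideal point (0, 0). Objective shapes are illustrative.
def alteration(q):          # hydrologic alteration falls as instream flow rises
    return 1.0 / (1.0 + q / 10.0)

def shortage(q):            # water-supply shortage rises with reserved flow
    return (q / 60.0) ** 2

w = np.array([0.5, 0.5])    # equal weighting of the two objectives

def distance(q):
    f = np.array([alteration(q), shortage(q)])
    return float(np.sqrt(np.sum((w * f) ** 2)))

best = minimize_scalar(distance, bounds=(0.0, 60.0), method="bounded")
print(best.x)               # interior compromise flow, between the two extremes
```

Changing the weights w shifts the compromise flow toward whichever objective is favored, which is the weighting-factor analysis the abstract reports.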


    THE VACUITY OF POSTMODERNIST METHODOLOGY

    METAPHILOSOPHY, Issue 3 2005
    Nicholas Shackel
    Abstract: Many of the philosophical doctrines purveyed by postmodernists have been roundly refuted, yet people continue to be taken in by the dishonest devices used in proselytizing for postmodernism. I exhibit, name, and analyse five favourite rhetorical manoeuvres: Troll's Truisms, Motte and Bailey Doctrines, Equivocating Fulcra, the Postmodernist Fox Trot, and Rankly Relativising Fields. Anyone familiar with postmodernist writing will recognise their pervasive hold on the dialectic of postmodernism and come to judge that dialectic as it ought to be judged. [source]


    A NEW TRUE ORTHO-PHOTO METHODOLOGY FOR COMPLEX ARCHAEOLOGICAL APPLICATION

    ARCHAEOMETRY, Issue 3 2010
    YAHYA ALSHAWABKEH
    Ortho-photo is one of the most important photogrammetric products for archaeological documentation. It consists of a powerful textured representation combining geometric accuracy with rich detail, such as areas of damage and decay. Archaeological applications are usually faced with complex object shapes. Compared with conventional algorithms, ortho-projection of such rough curved objects is still a problem, due to the complex description of the analytical shape of the object. Even using a detailed digital surface model, typical ortho-rectification software does not produce the desired outcome, being incapable of handling image visibility and model occlusions, since it is limited to 2.5-dimensional surface descriptions. This paper presents an approach for the automated production of true ortho-mosaics for the documentation of cultural objects. The algorithm uses precise three-dimensional surface representations derived from laser scanning and several digital images that entirely cover the object of interest. After identifying all model surface triangles in the viewing direction, the triangles are projected back on to all initial images to establish visibilities for every available image. Missing image information can be filled in from adjacent images that must have been subjected to the same true ortho-photo procedure. [source]
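The back-projection step at the heart of this pipeline, mapping model triangles into each source image to establish visibility, reduces to the pinhole camera equation u = K[R|t]X. A minimal sketch with made-up intrinsics and an identity pose (all values are illustrative assumptions):

```python
import numpy as np

# Project model triangle vertices into an image with a pinhole camera to find
# which pixels the triangle covers. Intrinsics and geometry are made up.
K = np.array([[800.0,   0.0, 320.0],    # fx,  0, cx
              [  0.0, 800.0, 240.0],    #  0, fy, cy
              [  0.0,   0.0,   1.0]])
# One surface triangle (world coords == camera coords here: R = I, t = 0)
tri = np.array([[0.0, 0.0, 4.0],
                [1.0, 0.0, 4.0],
                [0.0, 1.0, 5.0]])

uv = (K @ tri.T).T                      # homogeneous image coordinates
uv = uv[:, :2] / uv[:, 2:3]             # perspective divide -> pixel coords
print(uv)                               # (320,240), (520,240), (320,400)
```

A full implementation would repeat this for every triangle and keep, per pixel, only the triangle nearest the camera (a depth buffer), which is how the algorithm decides occlusion and visibility per image.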


    ERROR SPACE MOTION CONTROL METHODOLOGY FOR COMPLEX CONTOURS

    ASIAN JOURNAL OF CONTROL, Issue 1 2005
    Robert G. Landers
    ABSTRACT Motion control is a critical component of many engineering systems (e.g., manufacturing, robotics). Most systems have standard interpolation and control schemes for linear and circular contours; therefore, complex contours are often decomposed into a series of line segments and circular arcs. However, there are discontinuities where the line segments and arcs are joined together, and time to complete the contour is substantially increased when acceleration/deceleration interpolation schemes are employed. A motion control scheme known as the error space motion control methodology is proposed in this paper to design servomechanism motion control systems that may be utilized for complex contours. The error space motion control methodology is applied to a two-axis motion control system and simulation studies are conducted for linear, circular, elliptical, and limacon contours. The results demonstrate the excellent tracking ability of the proposed error space motion control methodology and its utility for complex contours. [source]


    Interactive Graphics for Computer Adaptive Testing

    COMPUTER GRAPHICS FORUM, Issue 8 2009
    I. Cheng
    K.3.1 [Computing Milieux]: Computers and Education, Computer Uses in Education; I.3.8 [Computing Methodologies]: Computer Graphics, Applications. Abstract: Interactive graphics are commonly used in games and have been shown to be successful in attracting the general audience. Instead of computer games, animations, cartoons, and videos being used only for entertainment, there is now an interest in using interactive graphics for 'innovative testing'. Rather than traditional pen-and-paper tests, audio, video and graphics are being conceived as alternative means for more effective testing in the future. In this paper, we review some examples of graphics item types for testing. As well, we outline how games can be used to interactively test concepts; discuss designing chemistry item types with interactive 3D graphics; suggest approaches for automatically adjusting difficulty level in interactive graphics based questions; and propose strategies for giving partial marks for incorrect answers. We study how to test different cognitive skills, such as music, using multimedia interfaces; and also evaluate the effectiveness of our model. Methods for estimating the difficulty level of a mathematical item type using Item Response Theory (IRT) and of a molecule construction item type using Graph Edit Distance are discussed. Evaluation of the graphics item types through extensive testing on students is described. We also outline the application of using interactive graphics over cell phones. All of the graphics item types used in this paper were developed by members of our research group. [source]
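As a sketch of the IRT difficulty estimation the abstract mentions: under the two-parameter logistic model, the probability of a correct answer is P(θ) = 1 / (1 + exp(−a(θ − b))), and an item's difficulty b can be estimated by maximum likelihood from scored responses. A hypothetical example with simulated students (the discrimination a = 1.5 and all data are assumptions, not from the paper):

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Two-parameter logistic (2PL) IRT: P(correct) = 1 / (1 + exp(-a*(theta - b))).
# Estimate item difficulty b by maximum likelihood from simulated responses,
# with discrimination a fixed (a = 1.5 is an assumption for illustration).
rng = np.random.default_rng(2)
a, true_b = 1.5, 0.8
theta = rng.normal(0.0, 1.0, 500)                  # simulated student abilities
p = 1.0 / (1.0 + np.exp(-a * (theta - true_b)))
u = (rng.random(500) < p).astype(float)            # simulated right/wrong answers

def negloglik(b):
    q = np.clip(1.0 / (1.0 + np.exp(-a * (theta - b))), 1e-12, 1 - 1e-12)
    return -np.sum(u * np.log(q) + (1.0 - u) * np.log(1.0 - q))

b_hat = minimize_scalar(negloglik, bounds=(-4.0, 4.0), method="bounded").x
print(b_hat)    # maximum-likelihood difficulty estimate, near the true 0.8
```

In adaptive testing, the difficulty estimated this way is what drives the selection of the next question for a given examinee.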


    Characteristics of Important Stopover Locations for Migrating Birds: Remote Sensing with Radar in the Great Lakes Basin

    CONSERVATION BIOLOGY, Issue 2 2009
    DAVID N. BONTER
    Keywords: migratory land bird; migration; radar; temporal stopover sites; WSR-88D. Abstract: A preliminary stage in developing comprehensive conservation plans involves identifying the areas used by the organisms of interest. The areas used by migratory land birds during temporal breaks in migration (stopover periods) have received relatively little research and conservation attention, and methodologies for identifying stopover sites across large geographic areas were, until recently, unavailable. Advances in weather-radar technology now allow evaluation of bird migration patterns at large spatial scales. We analyzed radar data (WSR-88D) recorded during spring migration in 2000 and 2001 at 6 sites in the Great Lakes basin (U.S.A.). Our goal was to link areas of high migrant activity with the land-cover types and landscape contexts corresponding to those areas. To characterize the landscapes surrounding stopover locations, we integrated radar and land-cover data within a geographic information system. We compared landscape metrics within 5 km of areas that consistently hosted large numbers of migrants with the landscapes surrounding randomly selected areas used by relatively few birds during migration. Concentration areas were characterized by 1.2 times more forest cover and 9.3 times more water cover than areas with little migrant activity. We detected a strong negative relationship between migratory bird activity and agricultural land use. Examination of individual migration events confirmed the importance of fragments of forested habitat in highly altered landscapes and highlighted large concentrations of birds departing from near-shore terrestrial areas in the Great Lakes basin. We conclude that conservation efforts can be targeted more effectively through intensive analysis of radar imagery. [source]
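The buffer-comparison step described above (landscape metrics within 5 km of high-use versus randomly selected low-use sites) can be sketched as follows. This is a minimal illustration only: the cover fractions, site lists, and the `mean_cover` helper are invented for the example and are not from the study, which used a full GIS workflow.

```python
# Hypothetical sketch: compare mean land-cover fractions within 5-km buffers
# around migrant-concentration sites versus randomly selected low-use sites.
# All numbers below are illustrative, not data from the study.

def mean_cover(sites, cover_type):
    """Average fraction of one land-cover type across a list of site records."""
    return sum(s[cover_type] for s in sites) / len(sites)

concentration_sites = [
    {"forest": 0.48, "water": 0.20, "agriculture": 0.10},
    {"forest": 0.55, "water": 0.15, "agriculture": 0.08},
]
random_sites = [
    {"forest": 0.40, "water": 0.02, "agriculture": 0.45},
    {"forest": 0.45, "water": 0.02, "agriculture": 0.50},
]

for cover in ("forest", "water", "agriculture"):
    ratio = mean_cover(concentration_sites, cover) / mean_cover(random_sites, cover)
    print(f"{cover}: concentration/random cover ratio = {ratio:.2f}")
```

A ratio above 1 for forest and water, and below 1 for agriculture, would correspond to the pattern the abstract reports.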


    Surge Capacity for Health Care Systems: Early Detection, Methodologies, and Process

    ACADEMIC EMERGENCY MEDICINE, Issue 11 2006
    Peter L. Estacio PhD
    Excessive demand on hospital services from large-scale emergencies is something that every emergency department health care provider and hospital administrator knows could happen at any time. This country has not recently faced a disaster of the magnitude we now confront involving agents of mass destruction or social disruption, especially agents involving infectious diseases and radiological materials. The war on terrorism is not a conventional war, and terrorists may use any means of convenience to carry out their objectives on an unpredictable time line. Have we adequately prepared for the potentially excessive surge in demand for medical services that a large-scale event could bring to our medical care system? Are our emergency departments ready for such events? Surveillance systems such as BioWatch, BioSense, and the National Biosurveillance Integration System, together with the countermeasure program BioShield, offer hope that we will be able to meet these new challenges. [source]


    New Technology and Methodologies for Intraoperative, Perioperative, and Intraprocedural Monitoring of Surgical and Catheter Interventions for Congenital Heart Disease

    ECHOCARDIOGRAPHY, Issue 8 2002
    Mary J. Rice M.D.
    We review the new technology and methods available to support intraoperative and intraprocedural imaging in the catheterization laboratory for surgical and interventional catheterization procedures in the treatment of congenital heart disease. The methods reviewed include miniaturized probes and new ways of using them perioperatively for cardiac imaging from transesophageal, substernal, and intracardiac locations. The smaller and more versatile the probes, the better suited they will be to improving outcomes in babies born with serious forms of congenital heart disease. [source]


    Sensitive Adsorptive Stripping Voltammetric Methodologies for the Determination of Melatonin in Biological Fluids

    ELECTROANALYSIS, Issue 9 2003
    L. Corujo-Antuña
    Abstract The reversible redox process that melatonin presents on carbon paste electrodes was the basis of a sensitive methodology for the determination of this hormone. Of all the redox processes this hormone exhibits, this one had not previously been used as the basis of voltammetric measurements for melatonin determination. The parameters affecting the cyclic voltammetric signal were therefore studied. A limit of detection as low as 9×10⁻¹¹ M was obtained when optimized alternating-current voltammetry was employed. Reproducibility was excellent owing to an adequate pretreatment of the solid electrode (RSD = 2.7%, n = 10). The methodology was compared, in terms of its analytical characteristics, with methodologies employing other electrochemical techniques. This methodology has proved suitable for the determination of melatonin in biological fluids. [source]
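The reproducibility figure quoted above (RSD = 2.7%, n = 10) is a relative standard deviation over repeated measurements. A minimal sketch of that calculation, assuming invented peak-current values (the actual melatonin data are not given in the abstract):

```python
# Illustrative sketch, not the authors' data pipeline: reproducibility of a
# voltammetric signal reported as relative standard deviation (RSD) over
# n repeated runs. The peak currents below are invented example values.
import statistics

def rsd_percent(values):
    """Relative standard deviation (sample SD / mean) expressed in percent."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

peak_currents_nA = [10.2, 10.5, 9.9, 10.1, 10.4, 10.0, 10.3, 9.8, 10.2, 10.1]  # n = 10
print(f"RSD = {rsd_percent(peak_currents_nA):.1f}% (n = {len(peak_currents_nA)})")
```

A low RSD over repeated runs is what supports the abstract's claim of excellent reproducibility after electrode pretreatment.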


    Cover Picture: Electrophoresis 20'2009

    ELECTROPHORESIS, Issue 20 2009
    Article first published online: 27 OCT 200
    Issue no. 20 is a regular issue with an emphasis on "Fundamentals and Methodologies". The bulk of this issue (13 articles) covers fundamentals and methodologies on various topics, e.g., EOF, affinity CE, structural analysis of glycosphingolipids by CE-ESI-MS, on-line concentration, monolithic columns, etc. The other 6 articles are on protein separation and proteomics. Selected articles are: "Micropump based on electroosmosis of the second kind" (10.1002/elps.200900271); "A splicing model-based DNA computing approach on microfluidic chip" (10.1002/elps.200900323); "Proteomic characterization of plasma-derived clotting Factor VIII–von Willebrand Factor concentrates" (10.1002/elps.200900270). [source]


    Cover Picture: Electrophoresis 17'09

    ELECTROPHORESIS, Issue 17 2009
    Article first published online: 26 AUG 200
    Issue no. 17 is an Emphasis Issue with 10 articles on various aspects of "Proteins and Proteomics", while the remaining 10 articles are arranged into 3 parts: "Chip Technology", "Nucleic Acids", and "Methodologies, Assays and Applications". Selected articles are: "Proteomic analysis of Oenococcus oeni freeze-dried culture to assess the importance of cell acclimation to conduct malolactic fermentation in wine" (10.1002/elps.200900228); "Complete sequencing and oxidative modification of Mn-SOD (SOD2) in medulloblastoma cells" (10.1002/elps.200900168); "Differentiation of Staphylococcus aureus strains by capillary electrophoresis, zeta potential and coagulase gene polymorphism" (10.1002/elps.200900186). [source]


    Cover Picture: Electrophoresis 16'09

    ELECTROPHORESIS, Issue 16 2009
    Article first published online: 18 AUG 200
    Issue no. 16 is a special issue on "Enantioseparations". It consists of 19 research papers and 2 review articles distributed over 4 parts. The two review articles make up Part I and focus on recent developments in microchip enantioseparations and on chiral analysis of drugs, metabolites, and biomarkers in biological samples. The 19 research papers are distributed over the remaining 3 parts: "Fundamentals and Methodologies", "Chiral Capillary Electrochromatography", and "Biomedical, Pharmaceutical, Food and Environmental Applications of Electromigration Techniques". Issue no. 16 also includes a Fast Track paper on the "Analysis of genetic variation in Globocephaloides populations from macropodid marsupials using a mutation scanning-based approach". [source]


    Cover Picture: Electrophoresis 11'09

    ELECTROPHORESIS, Issue 11 2009
    Article first published online: 10 JUN 200
    Issue no. 11 is a regular issue consisting of 22 research articles distributed over 5 parts: (i) Proteins and Proteomics, (ii) EKC and CEC, (iii) Detection and Preconcentration Approaches, (iv) Enantioseparation, and (v) Methodologies and Assays. Selected articles are: "Quantifying Western blots: pitfalls of densitometry" (10.1002/elps.200800720); "Light emitting diode-induced chemiluminescence detection for capillary electrophoresis" (10.1002/elps.200800708); "Double sample preconcentration by in-line coupled large volume single drop microextraction and sweeping in capillary electrophoresis" (10.1002/elps.200800759). [source]


    Cover Picture: Electrophoresis 9'09

    ELECTROPHORESIS, Issue 9 2009
    Article first published online: 7 MAY 200
    Issue no. 9 is an Emphasis Issue with 7 articles on various aspects of "Microfluidics and Miniaturization", while the remaining articles are grouped into sections on "Detection Sensitivity Enhancement and Stacking", "Binding Studies", and "Other Methodologies". In addition, issue no. 9 has two Fast Track articles: the first on proteome alteration during early-stage differentiation of mouse embryonic stem cells into hepatocyte-like cells, and the second on dielectrophoretic separation of small particles in a sawtooth channel. [source]


    Cover Picture: Electrophoresis 8'09

    ELECTROPHORESIS, Issue 8 2009
    Article first published online: 20 APR 200
    Issue no. 8 is an Emphasis Issue with 12 articles in Part I on various aspects of "Fundamentals and Methodologies", while the remaining 5 articles are grouped into Part II on "Bioanalysis". In addition, this issue includes a Fast Track article demonstrating that omic analyses unravel molecular changes in the brain and liver of a rat model of chronic sake (Japanese alcoholic beverage) intake. Further selected topics from issue 8 are: "Introducing a new parameter for quality control of proteome profiles: Consideration of commonly expressed proteins" (10.1002/elps.200800440); "Improving sensitivity in micro-free flow electrophoresis using signal averaging" (10.1002/elps.200800497). [source]


    Cover Picture: Electrophoresis 19'2008

    ELECTROPHORESIS, Issue 19 2008
    Article first published online: 28 OCT 200
    This issue has a dual emphasis, "APCE 2007" and "Fundamentals and Methodologies", with the aim of providing readers of the Journal with the latest developments in the field. APCE 2007, held in Singapore, December 17–19, 2007, is the premier forum in the Asia-Pacific countries for communicating advances in capillary- and chip-based electroseparation techniques and their applications to genomics, proteomics, and chemical and biochemical analysis. The emphasis part on APCE 2007 is a "mini-proceedings" grouping 7 representative research articles, which deal with microchip electrophoresis, restriction fragment length polymorphism analysis by CE, gradient MEKC, stacking and sweeping in CE, in-line preconcentration in CE, and food and drug analysis by CE. In addition, issue 19 has a Fast Track article on the principles of the different modes of multiple-injection CZE, which provides equations that facilitate the transfer from single-injection CZE to one or more suitable modes of multiple-injection CZE. [source]


    Cover Picture: Electrophoresis 13/2008

    ELECTROPHORESIS, Issue 13 2008
    Article first published online: 11 JUL 200
    Issue 13 is a regular issue including an Emphasis Section offering a series of 10 papers on "Fundamentals and Methodologies". These papers address peptide isoelectric-point calculation, DNA separation by MEKC, modeling the mobility of apothioneins, protein expression in osteoarthritis, capillary coating, preparative separation of proteins by dynamic field gradient focusing, determination of pKa, speciation analysis by CE coupled to inductively coupled plasma MS, etc. [source]