Default

Distribution by Scientific Domains
Distribution within Business, Economics, Finance and Accounting

Kinds of Default

  • mortgage default

Terms modified by Default

  • default cost
  • default logic
  • default option
  • default probability
  • default rate
  • default risk
  • default swap
  • default value

Selected Abstracts


    A Quantitative Theory of Unsecured Consumer Credit with Risk of Default

    ECONOMETRICA, Issue 6 2007
    Satyajit Chatterjee
    We study, theoretically and quantitatively, the general equilibrium of an economy in which households smooth consumption by means of both a riskless asset and unsecured loans with the option to default. The default option resembles a bankruptcy filing under Chapter 7 of the U.S. Bankruptcy Code. Competitive financial intermediaries offer a menu of loan sizes and interest rates wherein each loan makes zero profits. We prove the existence of a steady-state equilibrium and characterize the circumstances under which a household defaults on its loans. We show that our model accounts for the main statistics regarding bankruptcy and unsecured credit while matching key macroeconomic aggregates, and the earnings and wealth distributions. We use this model to address the implications of a recent policy change that introduces a form of "means testing" for households contemplating a Chapter 7 bankruptcy filing. We find that this policy change yields large welfare gains. [source]
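
    The zero-profit condition on each loan in the menu has a simple arithmetic core, sketched below. This is not the authors' computation; the function name, the one-period setting and the riskless funding rate r are assumptions made purely for illustration.

        # Minimal sketch: zero-profit pricing of a one-period unsecured loan.
        # A competitive intermediary funds at the riskless rate r, expects repayment
        # of the face value with probability (1 - p_default), and nothing otherwise.
        def zero_profit_price(face_value: float, p_default: float, r: float) -> float:
            """Amount lent today so that expected repayment just covers funding costs."""
            return (1.0 - p_default) * face_value / (1.0 + r)

        # Larger loans in the menu carry higher default probabilities, so they are
        # advanced at a deeper discount, i.e. a higher implied interest rate.
        for face, p in [(1_000, 0.01), (5_000, 0.05), (10_000, 0.15)]:
            price = zero_profit_price(face, p, r=0.03)
            print(f"face={face:>6}  price={price:9.2f}  implied rate={face / price - 1.0:.2%}")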


    Default and Punishment in General Equilibrium

    ECONOMETRICA, Issue 1 2005
    Pradeep Dubey
    We extend the standard model of general equilibrium with incomplete markets to allow for default and punishment by thinking of assets as pools. The equilibrating variables include expected delivery rates, along with the usual prices of assets and commodities. By reinterpreting the variables, our model encompasses a broad range of adverse selection and signalling phenomena in a perfectly competitive, general equilibrium framework. Perfect competition eliminates the need for lenders to compute how the size of their loan or the price they quote might affect default rates. It also makes for a simple equilibrium refinement, which we propose in order to rule out irrational pessimism about deliveries of untraded assets. We show that refined equilibrium always exists in our model, and that default, in conjunction with refinement, opens the door to a theory of endogenous assets. The market chooses the promises, default penalties, and quantity constraints of actively traded assets. [source]
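
    The pooling device has an equally compact core: buyers price the pool off its expected delivery rate rather than any individual seller's behaviour. The toy numbers below are hypothetical and only illustrate the bookkeeping.

        # Sketch: an asset pool delivers the quantity-weighted average of what its
        # sellers actually hand over, so the equilibrating variable is the pool's
        # delivery rate, not borrower-level information.
        promises  = [100.0, 100.0, 100.0]   # face amounts promised into the pool
        delivered = [100.0,  60.0,   0.0]   # actual deliveries after default and penalties

        delivery_rate = sum(delivered) / sum(promises)
        print(f"expected delivery rate K = {delivery_rate:.2f}")

        # A buyer of the pooled promise expects K per unit of face value bought.
        face_bought = 50.0
        print(f"expected delivery on {face_bought} of face = {delivery_rate * face_bought:.1f}")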


    Efficiency, Equilibrium, and Asset Pricing with Risk of Default

    ECONOMETRICA, Issue 4 2000
    Fernando Alvarez
    We introduce a new equilibrium concept and study its efficiency and asset pricing implications for the environment analyzed by Kehoe and Levine (1993) and Kocherlakota (1996). Our equilibrium concept has complete markets and endogenous solvency constraints. These solvency constraints prevent default at the cost of reducing risk sharing. We show versions of the welfare theorems. We characterize the preferences and endowments that lead to equilibria with incomplete risk sharing. We compare the resulting pricing kernel with the one for economies without participation constraints: interest rates are lower and risk premia depend on the covariance of the idiosyncratic and aggregate shocks. Additionally, we show that asset prices depend only on the valuation of agents with substantial idiosyncratic risk. [source]


    Bank Mergers, Information, Default and the Price of Credit

    ECONOMIC NOTES, Issue 1 2006
    Margarida Catalão-Lopes
    This paper addresses the impact of bank mergers on the price of firm credit, through an information channel. It is shown that, as bank mergers imply a wider spreading of information among banks concerning firms' past defaults, they may increase the expected revenue from lending. Therefore, interest rates may decline as long as a sufficiently competitive environment is preserved. A fall in interest rates, in turn, reduces the incentives for firms to strategically default, which reinforces the downward effect on the price of credit. The results are a function of the level of information sharing and of the sensitivity of the default probability to the interest rate. [source]


    Waiving Technical Default: The Role of Agency Costs and Bank Regulations

    JOURNAL OF BUSINESS FINANCE & ACCOUNTING, Issue 9-10 2006
    Hassan R. HassabElnaby
    Abstract: This paper examines whether the characteristics of banks and borrowers are associated with banks' decisions to waive violations of debt covenants. The findings suggest that banks possess sufficient private information about firms, and they use this information in their waiver decisions. Banks' decisions to waive violations vary with the borrowers' agency costs, debt features, the banks' characteristics and regulatory circumstances, and the bank-firm business relationship. There is no evidence that syndicated loans, bank structure, and adverse economic conditions are significant determinants of the waiver decision. Research findings offer valuable insight into the theoretical and practical implications of debt covenants and agency costs. [source]


    Contemporaneous Loan Stress and Termination Risk in the CMBS Pool: How "Ruthless" is Default?

    REAL ESTATE ECONOMICS, Issue 2 2010
    Tracey Seslen
    This study analyzes the impact of contemporaneous loan stress on the termination of loans in the commercial mortgage-backed securities pool from 1992 to 2004 using a novel measure, based on changes in net operating incomes and property values at the metropolitan statistical area-property-type-year level. Employing a semi-parametric competing risks model for a variety of specifications, we find that the probability of default is extremely low even at very high levels of stress, although the coefficient estimates of greatest interest are highly statistically significant. These results suggest substantial lender forbearance and are consistent with previous research that models default as a "gradual dynamic process" rather than a "ruthless" exercise once "in the money." [source]


    Determinants of Multifamily Mortgage Default

    REAL ESTATE ECONOMICS, Issue 3 2002
    Wayne R. Archer
    Option-based models of mortgage default posit that the central measure of default risk is the loan-to-value (LTV) ratio. We argue, however, that an unrecognized problem with extending the basic option model to existing multifamily and commercial mortgages is that key variables in the option model are endogenous to the loan origination and property sale process. This endogeneity implies, among other things, that no empirical relationship may be observed between default and LTV. Since lenders may require lower LTVs in order to mitigate risk, mortgages with low and moderate LTVs may be as likely to default as those with high LTVs. Mindful of this risk endogeneity and its empirical implications, we examine the default experience of 495 fixed-rate multifamily mortgage loans securitized by the Resolution Trust Corporation (RTC) and the Federal Deposit Insurance Corporation (FDIC) during the period 1991-1996. The extensive nature of the data supports multivariate analysis of default incidence in a number of respects not possible in previous studies. Consistent with our expectations, we find that LTV evidences no relationship to default incidence, while the strongest predictors of default are property characteristics, including three-digit ZIP code location and initial cash flow as reflected in the debt coverage ratio. The latter results are particularly interesting in that they dominated the influence of postorigination changes in the local economy. [source]
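
    The multivariate analysis of default incidence is, in spirit, a limited-dependent-variable regression on loan and property characteristics. The sketch below is a generic stand-in rather than the authors' specification: the covariate names, the synthetic data and the logit form are assumptions for illustration only.

        # Generic sketch of a default-incidence regression (synthetic data,
        # not the RTC/FDIC sample used in the study).
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 495
        ltv = rng.uniform(0.4, 0.95, n)      # loan-to-value at origination
        dcr = rng.uniform(0.9, 1.8, n)       # initial debt coverage ratio

        # The data-generating process mimics the paper's finding: initial cash flow
        # (DCR) matters, while LTV is deliberately left out of the true model.
        default = rng.binomial(1, 1.0 / (1.0 + np.exp(-(2.0 - 3.0 * dcr))))

        X = sm.add_constant(np.column_stack([ltv, dcr]))
        fit = sm.Logit(default, X).fit(disp=False)
        print(fit.summary(xname=["const", "ltv", "dcr"]))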


    Mortgage Lending, Sample Selection and Default

    REAL ESTATE ECONOMICS, Issue 4 2000
    Stephen L Ross
    Traditional models of mortgage default suffer from sample-selection bias because they do not control for the loan approval process. This paper estimates a sample-selection-corrected default model using the 1990 Boston Federal Reserve loan application sample and the 1992 Federal Housing Authority (FHA) foreclosure sample. A single-equation FHA default model appears to suffer from substantial selection bias, but the bias primarily arises from the omission of credit history and other variables that are only in the application sample. Therefore, default models that contain detailed information on applicants may not suffer from substantial selection bias. Finally, a test for prejudice-based discrimination is developed and conducted, but the findings are inconclusive. [source]


    Income, Location and Default: Some Implications for Community Lending

    REAL ESTATE ECONOMICS, Issue 3 2000
    Robert Van Order
    This paper investigates differences in default losses across income groups and neighborhoods, in an effort to see if there are significant differences between default experience on loans to low-income households or low-income neighborhoods and other loans. We find that while defaults and losses are somewhat higher in low-income neighborhoods, default behavior is similar in the sense that responses to negative equity are similar across neighborhoods, and remaining differences are small and might be explained by omitted variables such as those measuring credit history. [source]


    Demographic variables routinely collected at colposcopic examination do not predict who will default from conservative management of cervical intraepithelial neoplasia I

    AUSTRALIAN AND NEW ZEALAND JOURNAL OF OBSTETRICS AND GYNAECOLOGY, Issue 1 2005
    Julie A. QUINLIVAN
    Abstract Objective: As a result of the low incidence of progression from low-grade epithelial abnormalities to cervical intraepithelial neoplasia (CIN) 3 or cervical cancer, a conservative approach to management is supported, especially in young women. Loss to follow-up is a recognised problem with a conservative approach, however, with women who default known to experience higher rates of cancer. Aim: To determine whether any routinely collected demographic variables could predict which Australian women would subsequently default from care having initially elected to have conservative management of CIN 1 lesions. Methods: Prospectively collected data were audited on 279 women with a colposcopically directed biopsy diagnosis of CIN 1, confirmed on external review, who were enrolled by their own choice into a conservative management program and monitored until a definitive lesion outcome was determined. Women who defaulted from follow-up and were lost to care providers despite follow-up appointments and reminder letters were compared with women who completed follow-up with either lesion resolution or progression requiring treatment, to establish whether any demographic variables predicted default from care. Results: Fifty-two (18.5%) women subsequently defaulted from follow-up. There were no significant differences in age, parity, the proportion of women who were pregnant at diagnosis, smoking status, immunosuppression, or a 'human papillomavirus (HPV) effect' reported on Pap smear or colposcopic examination. Conclusion: We cannot easily identify a subgroup of women who are more likely to default from follow-up of CIN 1 using routinely collected demographic data. Default from follow-up is a major risk with conservative approaches, and further research to reduce default rates is required. [source]


    Country Default Risk: An Empirical Assessment

    AUSTRALIAN ECONOMIC PAPERS, Issue 4 2001
    Jerome L. Stein
    We provide benchmarks to evaluate what is an optimal foreign debt and a maximal foreign debt (debt-max), when risk is explicitly considered. When the actual debt exceeds debt-max, then the economy will default when a 'bad shock' occurs. This paper is an application of the stochastic optimal control models of Fleming and Stein (2001), which gives empirical content to the question of how one should measure 'vulnerability' to shocks, when there is uncertainty concerning the productivity of capital. We consider two sets of high-risk countries during the period 1978-99: a subset of 21 countries that defaulted on the debt, and another set of 13 countries that did not default. Default is a situation where the firms or government of a country reschedule the interest/principal payments on the external debt. We thereby explain how our analysis can anticipate default risk, and add another dimension to the literature of early warning signals of default/credit risk. [source]


    Can default rates in colposcopy really be reduced?

    BJOG : AN INTERNATIONAL JOURNAL OF OBSTETRICS & GYNAECOLOGY, Issue 3 2008
    L Balasubramani
    A prospective postal questionnaire study aimed to identify variables that predict a woman's intention to attend and her subsequent attendance/default at colposcopy clinics. One thousand two hundred and fifty-eight women attending colposcopy clinics of a university hospital were sent a postal questionnaire 3 weeks before their second appointment at colposcopy. An intention to attend the colposcopy clinic was the most significant predictor for colposcopy attendance during the next 15 months. Smoking and a longer travel time were associated with default. Our study shows that while interventions tried by service providers can reduce default rates, there will remain a cohort of women who do not fully participate in the screening programme. [source]


    Variation in food availability influences prey-capture method in antlion larvae

    ECOLOGICAL ENTOMOLOGY, Issue 5 2008
    EFRAT ELIMELECH
    Abstract 1. Larvae of a Myrmecaelurus sp. are unique among antlions because they have two prey-capture methods; they either ambush prey at the surface, or dig pit traps that prey fall into. It was hypothesised that larvae will use the capture method that maximises their net rate of energy gain, which will be influenced by food availability (encounter rate) and by past energy inputs (body condition). 2. Costs were estimated by measuring resting and activity metabolic rates and determining the duration of pit maintenance at various encounter rates with ants that served as prey. Benefits were estimated from the energy gained per ant captured at different encounter rates. 3. Net energy gained was higher with a pit than without one, and was influenced more by the differences in prey capture rate between the two capture methods, and less by the differences in energy costs associated with each method. The proportion of larvae that constructed pits was higher when they were in intermediate body condition than when in good or in poor body condition. 4. Thus, the use of one capture method or the other depends on a combination of the influences of past net energy gain and the antlion's most recent change in encounter rate with prey. Ambushing without a pit may serve as a default when physiological constraints limit the larvae's ability to invest in pit construction and maintenance, or when larvae are sated, and saving the energy of pit construction and maintenance is worthwhile. [source]
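
    The cost-benefit comparison reduces to a small piece of arithmetic, sketched below with entirely hypothetical energy values (none are taken from the study): the pit pays off when its higher capture rate outweighs its construction and maintenance cost, and the ordering flips at low encounter rates.

        # Hypothetical numbers, for illustration only: daily net energy gain for
        # the two prey-capture methods at a given prey encounter rate.
        def net_gain(encounters, capture_prob, energy_per_prey, resting_cost, pit_cost=0.0):
            return encounters * capture_prob * energy_per_prey - resting_cost - pit_cost

        ambush = net_gain(encounters=4, capture_prob=0.2, energy_per_prey=30, resting_cost=10)
        pit    = net_gain(encounters=4, capture_prob=0.6, energy_per_prey=30, resting_cost=10,
                          pit_cost=25)   # digging and maintaining the trap
        print(f"ambush: {ambush:.1f} J/day   pit: {pit:.1f} J/day")   # 14.0 vs 37.0
        # At one encounter per day the same arithmetic gives -4.0 vs -17.0 J/day,
        # so ambushing without a pit becomes the better (least bad) option.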


    A General Formula for Valuing Defaultable Securities

    ECONOMETRICA, Issue 5 2004
    P. Collin-Dufresne
    Previous research has shown that under a suitable no-jump condition, the price of a defaultable security is equal to its risk-neutral expected discounted cash flows if a modified discount rate is introduced to account for the possibility of default. Below, we generalize this result by demonstrating that one can always value defaultable claims using expected risk-adjusted discounting provided that the expectation is taken under a slightly modified probability measure. This new probability measure puts zero probability on paths where default occurs prior to the maturity, and is thus only absolutely continuous with respect to the risk-neutral probability measure. After establishing the general result and discussing its relation with the existing literature, we investigate several examples for which the no-jump condition fails. Each example illustrates the power of our general formula by providing simple analytic solutions for the prices of defaultable securities. [source]
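
    For context, the benchmark the paper generalizes can be stated compactly; the notation below is the standard reduced-form convention (zero recovery for simplicity), not necessarily the authors'. With short rate r, risk-neutral default intensity λ, default time τ and a promised payoff X at maturity T, the no-jump case gives

        V_t \;=\; \mathbf{1}_{\{\tau > t\}}\,
                  \mathbb{E}^{\mathbb{Q}}_t\!\left[\exp\!\Big(-\int_t^T (r_s + \lambda_s)\,ds\Big)\, X\right].

    The paper's generalization keeps this expected risk-adjusted discounting but takes the expectation under a modified probability measure that puts zero probability on default before maturity.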


    Transform Analysis and Asset Pricing for Affine Jump-diffusions

    ECONOMETRICA, Issue 6 2000
    Darrell Duffie
    In the setting of 'affine' jump-diffusion state processes, this paper provides an analytical treatment of a class of transforms, including various Laplace and Fourier transforms as special cases, that allow an analytical treatment of a range of valuation and econometric problems. Example applications include fixed-income pricing models, with a role for intensity-based models of default, as well as a wide range of option-pricing applications. An illustrative example examines the implications of stochastic volatility and jumps for option valuation. This example highlights the impact on option 'smirks' of the joint distribution of jumps in volatility and jumps in the underlying asset price, through both jump amplitude as well as jump timing. [source]
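
    A hedged statement of the central object, in the standard notation of the affine literature rather than quoted from the paper: for an affine jump-diffusion state vector X_t, a discount rate R(X_s) affine in the state, and a coefficient vector u, the transform

        \psi(u, X_t, t, T) \;=\;
        \mathbb{E}\!\left[\exp\!\Big(-\int_t^T R(X_s)\,ds\Big)\, e^{\,u\cdot X_T} \,\Big|\, X_t\right]
        \;=\; e^{\,\alpha(t) + \beta(t)\cdot X_t}

    is exponential-affine, with α and β solving Riccati ordinary differential equations (terminal conditions α(T) = 0, β(T) = u) determined by the drift, diffusion, jump and discounting coefficients. Bond prices with intensity-based default and option prices then follow by evaluating or Fourier-inverting ψ.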


    Mortgage Terminations, Heterogeneity and the Exercise of Mortgage Options

    ECONOMETRICA, Issue 2 2000
    Yongheng Deng
    As applied to the behavior of homeowners with mortgages, option theory predicts that mortgage prepayment or default will be exercised if the call or put option is 'in the money' by some specific amount. Our analysis: tests the extent to which the option approach can explain default and prepayment behavior; evaluates the practical importance of modeling both options simultaneously; and models the unobserved heterogeneity of borrowers in the home mortgage market. The paper presents a unified model of the competing risks of mortgage termination by prepayment and default, considering the two hazards as dependent competing risks that are estimated jointly. It also accounts for the unobserved heterogeneity among borrowers, and estimates the unobserved heterogeneity simultaneously with the parameters and baseline hazards associated with prepayment and default functions. Our results show that the option model, in its most straightforward version, does a good job of explaining default and prepayment, but it is not enough by itself. The simultaneity of the options is very important empirically in explaining behavior. The results also show that there exists significant heterogeneity among mortgage borrowers. Ignoring this heterogeneity results in serious errors in estimating the prepayment behavior of homeowners. [source]
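
    A minimal stand-in for the competing-risks idea, not the authors' estimator (which treats the two hazards as dependent and integrates out unobserved heterogeneity): cause-specific hazards for prepayment and default can be fitted by treating the competing termination as censoring. Everything below, including the covariate names and the synthetic data, is an assumption for illustration.

        # Sketch: cause-specific competing risks for mortgage termination.
        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(1)
        n = 500
        call_itm = rng.uniform(0, 1, n)   # how far the prepayment (call) option is in the money
        put_itm = rng.uniform(0, 1, n)    # how far the default (put) option is in the money

        # Hypothetical latent times: deeper-in-the-money options are exercised sooner.
        t_prepay = rng.exponential(1.0 / (0.02 + 0.08 * call_itm))
        t_default = rng.exponential(1.0 / (0.005 + 0.03 * put_itm))
        t_censor = rng.uniform(24, 120, n)

        duration = np.minimum.reduce([t_prepay, t_default, t_censor])
        event = np.select([duration == t_prepay, duration == t_default], [1, 2], default=0)
        loans = pd.DataFrame({"duration": duration, "event": event,
                              "call_itm": call_itm, "put_itm": put_itm})

        # Fit one Cox model per cause, treating the other termination as censored.
        for cause, code in [("prepayment", 1), ("default", 2)]:
            df = loans.assign(end=(loans["event"] == code).astype(int)).drop(columns="event")
            cph = CoxPHFitter().fit(df, duration_col="duration", event_col="end")
            print(cause, cph.params_.round(2).to_dict())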


    Advancing Loss Given Default Prediction Models: How the Quiet Have Quickened

    ECONOMIC NOTES, Issue 2 2005
    Greg M. Gupton
    We describe LossCalc™ version 2.0: the Moody's KMV model to predict loss given default (LGD), the equivalent of (1 − recovery rate). LossCalc is a statistical model that applies multiple predictive factors at different information levels: collateral, instrument, firm, industry, country and the macroeconomy to predict LGD. We find that distance-to-default measures (from the Moody's KMV structural model of default likelihood) compiled at both the industry and firm levels are predictive of LGD. We find that recovery rates worldwide are predictable within a common statistical framework, which suggests that the estimation of economic firm value (which is then available to allocate to claimants according to each country's bankruptcy laws) is a dominant step in LGD determination. LossCalc is built on a global dataset of 3,026 recovery observations for loans, bonds and preferred stock from 1981 to 2004. This dataset includes 1,424 defaults of both public and private firms, both rated and unrated instruments, in all industries. We demonstrate out-of-sample and out-of-time LGD model validation. The model significantly improves on the use of historical recovery averages to predict LGD. [source]


    Measuring and Optimizing Portfolio Credit Risk: A Copula-based Approach

    ECONOMIC NOTES, Issue 3 2004
    Annalisa Di Clemente
    In this work, we present a methodology for measuring and optimizing the credit risk of a loan portfolio taking into account the non-normality of the credit loss distribution. In particular, we aim at modelling accurately joint default events for credit assets. In order to achieve this goal, we build the loss distribution of the loan portfolio by Monte Carlo simulation. The times until default of each obligor in portfolio are simulated following a copula-based approach. In particular, we study four different types of dependence structure for the credit assets in portfolio: the Gaussian copula, the Student's t-copula, the grouped t-copula and the Clayton n-copula (or Cook-Johnson copula). Our aim is to assess the impact of each type of copula on the value of different portfolio risk measures, such as expected loss, maximum loss, credit value at risk and expected shortfall. In addition, we want to verify whether and how the optimal portfolio composition may change utilizing various types of copula for describing the default dependence structure. In order to optimize portfolio credit risk, we minimize the conditional value at risk, a risk measure both relevant and tractable, by solving a simple linear programming problem subject to the traditional constraints of balance, portfolio expected return and trading. The outcomes, in terms of optimal portfolio compositions, obtained assuming different default dependence structures are compared with each other. The solution of the risk minimization problem may suggest how to restructure inefficient loan portfolios in order to obtain their best risk/return profile. In the absence of a developed secondary market for loans, we may follow the investment strategies indicated by the solution vector by utilizing credit default swaps. [source]
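
    The Monte Carlo engine behind such an exercise is compact. The sketch below is a generic one-factor Gaussian-copula version with illustrative parameters (flat hazard rates, equal exposures, a single LGD), not the paper's calibration; the Student's t, grouped-t and Clayton variants change only the step that generates the dependent uniforms.

        # Sketch: loan-portfolio loss distribution via a Gaussian copula on default times.
        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(42)
        n_obligors, n_sims, horizon = 50, 20_000, 1.0   # horizon in years
        exposure, lgd = 1.0, 0.6
        hazard = 0.02                                    # flat default intensity per obligor
        rho = 0.3                                        # one-factor asset correlation

        # Correlated uniforms from a one-factor Gaussian copula.
        z = (np.sqrt(rho) * rng.standard_normal((n_sims, 1))
             + np.sqrt(1 - rho) * rng.standard_normal((n_sims, n_obligors)))
        u = norm.cdf(z)

        # Exponential marginals turn the uniforms into default times.
        default_time = -np.log(1.0 - u) / hazard
        loss = ((default_time <= horizon) * exposure * lgd).sum(axis=1)

        var_99 = np.quantile(loss, 0.99)
        print(f"expected loss={loss.mean():.2f}  99% VaR={var_99:.2f}  "
              f"99% expected shortfall={loss[loss >= var_99].mean():.2f}")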


    Pricing Loans Using Default Probabilities

    ECONOMIC NOTES, Issue 2 2003
    Stuart M. Turnbull
    This paper examines the pricing of loans using the term structure of the probability of default over the life of the loan. We describe two methodologies for pricing loans. The first methodology uses the term structure of credit spreads to price a loan, after adjusting for the difference in recovery rates between bonds and loans. In loan origination, it is common practice to estimate the probability of default for a loan over a specified time horizon and the loss given default. The second methodology shows how to incorporate this information into the arbitrage-free pricing of a loan. We also show how to derive an estimate of the credit spread due to liquidity risk. For both methodologies, we show how to calculate a break-even credit spread, taking into account the fee structure of a loan and the costs associated with the term structure of marginal economic capital. The break-even spread is the minimum spread for the loan to be EVA neutral in a multi-period setting. (J.E.L.: G12, G33). [source]
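
    In outline, the second methodology prices the loan from a term structure of survival probabilities. The bare-bones version below uses a flat discount rate, deterministic annual default probabilities and a single recovery rate, all illustrative; it leaves out the fee structure, the liquidity spread and the economic-capital costs that the paper builds in.

        # Sketch: PV of a bullet loan from a term structure of default probabilities.
        def loan_pv(principal, coupon_rate, annual_default_probs, recovery, discount_rate):
            pv, survival = 0.0, 1.0
            for year, pd_t in enumerate(annual_default_probs, start=1):
                df = 1.0 / (1.0 + discount_rate) ** year
                default_this_year = survival * pd_t          # unconditional default probability
                survival *= 1.0 - pd_t                       # survive to the end of this year
                pv += df * survival * coupon_rate * principal            # coupon if still alive
                pv += df * default_this_year * recovery * principal      # recovery if not
            return pv + survival * principal / (1.0 + discount_rate) ** len(annual_default_probs)

        pds = [0.010, 0.012, 0.015, 0.018, 0.020]            # illustrative 5-year term structure
        value = loan_pv(1_000_000, coupon_rate=0.06, annual_default_probs=pds,
                        recovery=0.70, discount_rate=0.04)
        print(f"PV = {value:,.0f}")
        # A break-even spread is then the coupon that makes this PV match the amount
        # advanced, net of fees and the cost of the economic capital held against the loan.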


    Is Greece heading for default?

    ECONOMIC OUTLOOK, Issue 2 2010
    Article first published online: 4 MAY 2010
    First page of article [source]


    Incorporating Collateral Value Uncertainty in Loss Given Default Estimates and Loan-to-value Ratios

    EUROPEAN FINANCIAL MANAGEMENT, Issue 3 2003
    Esa Jokivuolle
    Abstract We present a model of risky debt in which collateral value is correlated with the possibility of default. The model is then used to study the expected loss given default, primarily as a function of collateral. The results obtained could prove useful for estimating losses given default in many popular models of credit risk which assume them constant. We also examine the problem of determining sufficient collateral to secure a loan to a desired extent. In addition to bank practitioners, regulators might find our analysis useful in reviewing banks' lending standards relative to current collateral values. In particular, the current proposals for The New (Basel) Capital Accord involve options for the use of banks' own loss given default estimates which might benefit from the analysis in this paper. [source]
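
    A compressed illustration of the mechanism, using a structural-style simulation with made-up parameters rather than the authors' model: when collateral value is positively correlated with the borrower's fortunes, defaults coincide with depressed collateral, so the loss given default measured at default exceeds what an uncorrelated calculation would suggest.

        # Sketch: expected loss given default when collateral is correlated with default.
        import numpy as np

        rng = np.random.default_rng(7)
        n, loan, rho = 200_000, 100.0, 0.5       # rho: borrower-asset / collateral correlation

        z_firm = rng.standard_normal(n)
        z_coll = rho * z_firm + np.sqrt(1 - rho**2) * rng.standard_normal(n)

        default = z_firm < -2.0                              # roughly a 2.3% default rate
        collateral = 80.0 * np.exp(0.25 * z_coll - 0.03)     # lognormal collateral value

        lgd = np.maximum(loan - collateral, 0.0) / loan
        print(f"E[LGD | default] = {lgd[default].mean():.1%}   "
              f"unconditional E[(loan - collateral)+ / loan] = {lgd.mean():.1%}")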


    Real Estate Brokerage, Homebuyer Training, and Homeownership Sustainability for Housing Assistance Programs

    FAMILY & CONSUMER SCIENCES RESEARCH JOURNAL, Issue 4 2009
    Wayne Archer
    This study examines a previously overlooked factor in the rate of default on home loans by marginal first-time homebuyers; namely, the purchase transaction process. In particular, the study examines the potential for the type of initial contact in a homebuyer assistance program to affect the likelihood of default on a subsequent home loan. Using data from 41 state-funded local assistance programs in Florida, the study is able to examine the relationship of program default rates to the source of applicants for assistance. Specifically, it examines the explanatory capacity of the percentage of applicants who had a contract to purchase prior to applying for assistance, which indicates that the applicant had already engaged with a broker or lender. It finds that the percentage of applicants for assistance who had already engaged with a broker or lender is significantly and positively related to the program default rate. [source]


    Capital Allocation and Risk Performance Measurement In a Financial Institution

    FINANCIAL MARKETS, INSTITUTIONS & INSTRUMENTS, Issue 5 2000
    Stuart M. Turnbull
    This paper provides an analytical and practical framework, consistent with maximizing the wealth of existing shareholders, to address the following questions: What are the costs associated with economic capital? What is the tradeoff between the probability of default and the costs of economic capital? How do we take into account the time profile of economic capital when assessing the performance of a business? What is the appropriate measure of profitability, keeping the probability of default constant? It is shown that the capital budgeting decision depends not only on the covariance of the return of a project with the market portfolio, but also on the covariance with the bank's existing assets. This dependency arises from the simple fact that the economic capital is not additive. [source]


    PY181 Pigment Microspheres of Nanoplates Synthesized via Polymer-Induced Liquid Precursors

    ADVANCED FUNCTIONAL MATERIALS, Issue 13 2009
    Yurong Ma
    Abstract Organic pigments are important crystalline substances, and their properties and applications rely on size and shape control. Pigment Yellow 181 (PY181) is an industrial azo pigment that is light- and weatherfast and suitable for high-temperature processing. One disadvantage is its needle-like shape in the default phase, which makes the pigment difficult to process in industry, e.g., in polymer melts, where a spherical structure would be ideal. Here, we show for the first time that polymer-induced liquid precursor structures can be formed even in association with a chemical reaction. Furthermore, it is demonstrated that biomineralization principles can be exploited for the generation of advanced functional materials, such as pigments with novel complex morphology and different properties. Stable PY181 microspheres of nanoplates were obtained in mixed solvents of water and isopropanol by direct azo coupling under the directing influence of a designed copolymer additive, aminobenzoylaminobenzamide-acetoacetyl-poly(ethylene imine)-block-poly(ethylene glycol) (ABABA-acetoacetyl-PEI-b-PEG). [source]


    Capital Assistance for Small Firms: Some Implications for Regional Economic Welfare

    GEOGRAPHICAL ANALYSIS, Issue 1 2000
    Daniel Felsenstein
    This paper analyzes the role of finance capital in regional economic development. A cost-benefit approach is invoked in order to estimate the welfare impacts of a regional loan and guarantee program for small firms in Israel. Program-created employment is treated as a benefit, and an employment account that separates net from gross employment is presented. An estimate of net wage benefits is then derived. This involves adjusting wages across different earnings classes in order to account for the variation in opportunity costs of labor at different levels. The estimation of costs includes the opportunity costs of capital, administration, default, and tax-raising costs. Results point to substantial regional welfare effects. We stress the need to account for changing regional economic structure in this kind of evaluation framework. [source]


    Non-uniqueness with refraction inversion: the Mt Bulga shear zone

    GEOPHYSICAL PROSPECTING, Issue 4 2010
    Derecke Palmer
    ABSTRACT The tau-p inversion algorithm is widely employed to generate starting models with many computer programs that implement refraction tomography. However, this algorithm can frequently fail to detect even major lateral variations in seismic velocities, such as a 50 m wide shear zone, which is the subject of this study. By contrast, the shear zone is successfully defined with the inversion algorithms of the generalized reciprocal method. The shear zone is confirmed with a 2D analysis of the head wave amplitudes, a spectral analysis of the refraction convolution section and with numerous closely spaced orthogonal seismic profiles recorded for a later 3D refraction investigation. Further improvements in resolution, which facilitate the recognition of additional zones with moderate reductions in seismic velocity, are achieved with a novel application of the Hilbert transform to the refractor velocity analysis algorithm. However, the improved resolution also requires the use of a lower average vertical seismic velocity, which accommodates a velocity reversal in the weathering. The lower seismic velocity is derived with the generalized reciprocal method, whereas most refraction tomography programs assume vertical velocity gradients as the default. Although all of the tomograms are consistent with the traveltime data, the resolution of each tomogram is comparable only with that of the starting model. Therefore, it is essential to employ inversion algorithms that can generate detailed starting models, where detailed lateral resolution is the objective. Non-uniqueness can often be readily resolved with head wave amplitudes, attribute processing of the refraction convolution section and additional seismic traverses, prior to the acquisition of any borehole data. It is concluded that, unless specific measures are taken to address non-uniqueness, the production of a single refraction tomogram that fits the traveltime data to sufficient accuracy does not necessarily demonstrate that the result is either correct, or even the most probable. [source]


    Negative BOLD responses to epileptic spikes

    HUMAN BRAIN MAPPING, Issue 6 2006
    Eliane Kobayashi
    Abstract Simultaneous electroencephalogram/functional magnetic resonance imaging (EEG-fMRI) during interictal epileptiform discharges can result in positive (activation) and negative (deactivation) changes in the blood oxygenation level-dependent (BOLD) signal. Activation probably reflects increased neuronal activity and energy demand, but deactivation is more difficult to explain. Our objective was to evaluate the occurrence and significance of deactivations related to epileptiform discharges in epilepsy. We reviewed all EEG-fMRI studies from our database, identified those with robust responses (P = 0.01, with ≥5 contiguous voxels with a |t| > 3.1, including ≥1 voxel at |t| > 5.0), and divided them into three groups: activation (A = 8), deactivation (D = 9), and both responses (AD = 43). We correlated responses with discharge type and location and evaluated their spatial relationship with regions involved in the "default" brain state (Raichle et al. [2001]: Proc Natl Acad Sci 98:676-682). Deactivations were seen in 52/60 studies (AD+D): 26 related to focal discharges, 12 bilateral, and 14 generalized. Deactivations were usually distant from anatomical areas related to the discharges and more frequently related to polyspike- and spike-and-slow waves than to spikes. The "default" pattern occurred in 10/43 AD studies, often associated with bursts of generalized discharges. In conclusion, deactivations are frequent, mostly with concomitant activation, for focal and generalized discharges. Discharges followed by a slow wave are more likely to result in deactivation, suggesting neuronal inhibition as the underlying phenomenon. Involvement of the "default" areas, related to bursts of generalized discharges, provides evidence of a subclinical effect of the discharges, temporarily suspending normal brain function in the resting state. Hum Brain Mapp, 2005. © 2005 Wiley-Liss, Inc. [source]