Thursday, November 28, 2019

Albinism Essays - Skin Pigmentation, Albinism,

Albinism

Albinism is a term used to describe people and animals that have little or no pigment in their eyes, skin, or hair. People with this condition have inherited genes that do not produce normal amounts of a pigment called melanin. It is equally common to all races and consists of two major classes. The first, oculocutaneous albinism, involves the eyes, skin, and hair. The second, ocular albinism, involves mainly the eye. The oculocutaneous variety can be divided into ten different types, the most common being "ty-negative" and "ty-positive." Ty-negative leaves the person with no melanin pigmentation, hampers vision to a much more severe degree than ty-positive, and is caused by a genetic defect in the enzyme called tyrosinase. People with ty-positive will have very slight pigmentation and fewer vision problems. Ocular albinism may give the bearer slightly lighter hair and skin color, compared with the rest of their family, as well as the more obvious effects on the eyes. The pigment loss may lead to involuntary back-and-forth movement of the eyes, crossed eyes, and sensitivity to bright light. Nerves going from the brain to the eye are not routed properly and have more nerve fibers crossing to the opposite side of the brain than normal.

Both types of albinism are passed from parent to child and almost always require that both parents carry an albinism gene. This is referred to as autosomal recessive inheritance: the parents may have normal pigmentation, yet carry the gene and have a baby with albinism. A new test can now identify carriers of the gene for ty-negative and any other types where the tyrosinase enzyme doesn't function. A blood sample is used to determine whether the gene is present by reading the DNA. X-linked inheritance differs from autosomal recessive inheritance because only the mother carries the gene. The albinism gene is passed on the X chromosome from the mother, almost always to her son. Carrier mothers can be recognized by an ophthalmologist because of subtle eye changes. Albinism is unselective in race; Caucasians and non-Caucasians share this gene defect equally. One in 17,000 people has some type of albinism. In autosomal recessive inheritance, if both parents carry the gene yet neither has albinism, there is a one in four chance that the baby from each pregnancy will be born with albinism.

Treatment of albinism consists primarily of "visual rehabilitation." Surgery can be used to correct crossed eyes, but it does not correct the problem with the routing of nerves, so it does not give binocular vision. Sensitivity to bright light can be combated with tints or sunglasses. Some specific optical aids, such as bifocals and magnifiers, are also very helpful for this condition. The effects of the condition are not reversible, however, because it is part of a person's genetic makeup; it can only be helped with aids such as these.

Albinism is a very misunderstood condition, and because of this, children can have a tough childhood. They are prone to isolation due to the misunderstandings. People question their parentage, perhaps assuming a mixed marriage, and cast the family out. They may face criticism and ridicule in the classroom. Other students will not be able to understand why they appear this way and deal with it the best way they know how: laughing, smirking, giggling, and so on. Children with albinism may need special emotional support from both their parents and teachers. They should be included in all group activities as well, so they don't stand out.
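The one-in-four figure mentioned above follows directly from a Punnett square for two carrier parents. A minimal sketch (the A/a gene symbols are illustrative only, not standard notation for any specific albinism gene):

```python
from itertools import product

# Each carrier parent has one working allele (A) and one albinism allele (a),
# and passes one of the two to the child with equal probability.
carrier = ["A", "a"]

# Enumerate the four equally likely allele combinations a child can inherit.
offspring = list(product(carrier, carrier))
affected = [pair for pair in offspring if pair == ("a", "a")]

print(f"Affected outcomes: {len(affected)} of {len(offspring)}")  # 1 of 4 -> 25%
```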

Sunday, November 24, 2019

The 19th Century Bone Wars

The 19th Century Bone Wars

When most people think of the Wild West, they picture Buffalo Bill, Jesse James, and caravans of settlers in covered wagons. But for paleontologists, the American west in the late 19th century conjures up one image above all: the enduring rivalry between two of this country's greatest fossil hunters, Othniel C. Marsh and Edward Drinker Cope. The Bone Wars, as their feud became known, stretched from the 1870s well into the 1890s, and resulted in hundreds of new dinosaur finds, not to mention reams of bribery, trickery, and outright theft, as we'll get to later. (Knowing a good subject when it sees one, HBO recently announced plans for a movie version of the Bone Wars starring James Gandolfini and Steve Carell; sadly, Gandolfini's sudden death has put the project in limbo.)

In the beginning, Marsh and Cope were cordial, if somewhat wary, colleagues, having met in Germany in 1864 (at the time, western Europe, not the United States, was at the forefront of paleontology research). Part of the trouble stemmed from their different backgrounds: Cope was born into a wealthy Quaker family in Pennsylvania, while Marsh's family in upstate New York was comparatively poor (albeit with a very rich uncle, who enters the story later). It's probable that, even then, Marsh considered Cope a bit of a dilettante, not really serious about paleontology, while Cope saw Marsh as too rough and uncouth to be a true scientist.

The Fateful Elasmosaurus

Most historians trace the start of the Bone Wars to 1868, when Cope reconstructed a strange fossil sent to him from Kansas by a military doctor. Naming the specimen Elasmosaurus, he placed its skull on the end of its short tail, rather than its long neck (to be fair to Cope, to that date no one had ever seen an aquatic reptile with such out-of-whack proportions). When he discovered this error, Marsh (as the legend goes) humiliated Cope by pointing it out in public, at which point Cope tried to buy (and destroy) every copy of the scientific journal in which he had published his incorrect reconstruction. This makes for a good story, and the fracas over Elasmosaurus certainly contributed to the enmity between the two men, but the Bone Wars likely started on a more serious note. Cope had discovered the fossil site in New Jersey that yielded the fossil of Hadrosaurus, named by the two men's mentor, the famous paleontologist Joseph Leidy. When he saw how many bones had yet to be recovered from the site, Marsh paid the excavators to send any interesting finds to him, rather than to Cope. Cope soon found out about this gross violation of scientific decorum, and the Bone Wars began in earnest.

Into the West

What kicked the Bone Wars into high gear was the discovery, in the 1870s, of numerous dinosaur fossils in the American west (some of these finds were made accidentally, during excavation work for the Transcontinental Railroad). In 1877, Marsh received a letter from Colorado schoolteacher Arthur Lakes, describing the saurian bones he had found during a hiking expedition; Lakes sent sample fossils to both Marsh and (because he didn't know if Marsh was interested) Cope. Characteristically, Marsh paid Lakes $100 to keep his discovery a secret, and when he discovered that Cope had been notified, dispatched an agent west to secure his claim. Around the same time, Cope was tipped off to another fossil site in Colorado, which Marsh tried (unsuccessfully) to horn in on.
By this time, it was common knowledge that Marsh and Cope were competing for the best dinosaur fossils, which explains the subsequent intrigues centered on Como Bluff, Wyoming. Using pseudonyms, two workers for the Union Pacific Railroad alerted Marsh to their fossil finds, hinting (but not stating explicitly) that they might strike a deal with Cope if Marsh didn't offer generous terms. True to form, Marsh dispatched another agent, who made the necessary financial arrangements, and soon the Yale-based paleontologist was receiving boxcars of fossils, including the first specimens of Diplodocus, Allosaurus and Stegosaurus. Word about this exclusive arrangement soon spread, not least because the Union Pacific employees leaked the scoop to a local newspaper, exaggerating the prices Marsh had paid for the fossils in order to bait the trap for the wealthier Cope. Soon, Cope sent his own agent westward, and when these negotiations proved unsuccessful (possibly because he wasn't willing to pony up enough money), he instructed his prospector to engage in a bit of fossil-rustling and steal bones from the Como Bluff site, right under Marsh's nose. Soon afterward, fed up with Marsh's erratic payments, one of the railroad men began working for Cope instead, turning Como Bluff into the epicenter of the Bone Wars. By this time, both Marsh and Cope had relocated westward, and over the next few years engaged in such hijinks as deliberately destroying uncollected fossils and fossil sites (so as to keep them out of each other's hands), spying on each other's excavations, bribing employees, and even stealing bones outright. According to one account, workers on the rival digs once took time out from their labors to pelt each other with stones!

Cope and Marsh, Bitter Enemies to the Last

By the 1880s, it was clear that Othniel C. Marsh was winning the Bone Wars. Thanks to the support of his wealthy uncle, George Peabody (who lent his name to the Yale Peabody Museum of Natural History), Marsh could hire more employees and open more dig sites, while Edward Drinker Cope slowly but surely fell behind. It didn't help matters that other parties, including a team from Harvard University, now joined the dinosaur gold rush. Cope continued to publish numerous papers, but, like a political candidate taking the low road, Marsh made hay out of every tiny mistake he could find. Cope soon had his opportunity for revenge. In 1884, Congress began an investigation into the U.S. Geological Survey, which Marsh had been appointed the head of a few years before. Cope recruited a number of Marsh's employees to testify against their boss (who wasn't the easiest person in the world to work for), but Marsh connived to keep their grievances out of the newspapers. Cope then upped the ante: drawing on a journal he had kept for two decades, in which he meticulously listed Marsh's numerous felonies, misdemeanors and scientific errors, he supplied the information to a journalist for the New York Herald, which ran a sensational series about the Bone Wars. Marsh issued a rebuttal in the same newspaper, hurling similar accusations against Cope. In the end, this public airing of dirty laundry (and dirty fossils) didn't benefit either party.
Marsh was asked to resign his lucrative position at the Geological Survey, and Cope, after a brief interval of success (he was appointed head of the American Association for the Advancement of Science), was beset by poor health and had to sell off portions of his hard-won fossil collection. By the time Cope died in 1897, both men had squandered their considerable fortunes. Characteristically, though, Cope prolonged the Bone Wars even from his grave. One of his last requests was that scientists dissect his head after his death to determine the size of his brain, which he was certain would be bigger than Marsh's. Wisely, perhaps, Marsh declined the challenge, and to this day, Cope's unexamined head rests in storage at the University of Pennsylvania.

The Bone Wars: Let History Judge

As tawdry, undignified, and out-and-out ridiculous as the Bone Wars occasionally were, they had a profound effect on American paleontology. In the same way competition is good for commerce, it can also be good for science: so eager were Othniel C. Marsh and Edward Drinker Cope to one-up each other that they discovered many more dinosaurs than if they'd merely engaged in a friendly rivalry. The final tally was truly impressive: Marsh discovered 80 new dinosaur genera and species, while Cope named a more-than-respectable 56. The fossils discovered by Marsh and Cope also helped to feed the American public's increasing hunger for new dinosaurs. Each major discovery was accompanied by a wave of publicity, as magazines and newspapers illustrated the latest amazing finds, and the reconstructed skeletons slowly but surely made their way to major museums, where they still reside to the present day. You might say that popular interest in dinosaurs really began with the Bone Wars, though it's arguable that it would have come about naturally, without all the bad feelings! The Bone Wars had a couple of negative consequences, as well. First, paleontologists in Europe were horrified by the crude behavior of their American counterparts, which left a lingering, bitter distrust that took decades to dissipate. And second, Cope and Marsh described and reassembled their dinosaur finds so quickly that they were occasionally careless. For example, a hundred years of confusion about Apatosaurus and Brontosaurus can be traced directly back to Marsh, who put a skull on the wrong body, the same way Cope did with Elasmosaurus, the incident that started the Bone Wars in the first place!

Thursday, November 21, 2019

Fluoridation and Toxicity Issues Assignment Example | Topics and Well Written Essays - 1500 words - 1

Fluoridation and Toxicity Issues - Assignment Example

Fluoridation might actually result in the darkening of the teeth, or dental fluorosis, and may even affect the gums (The Debate over Adding Fluoride in Our Water, 2013). This would resemble what American researchers called the Colorado Brown Stain, which was a result of excessive exposure to fluoride and which affected some children from 1909 to 1915. Moreover, the darkening of the teeth was not related to tooth decay (The Story of Fluoridation, 2011). A review by Parnell et al. (2009) identified 88 studies indicating that fluorosis may result from drinking water treated with fluoride. Fluoride consumption in drinking water may also be associated with problems concerning the health of the skeletal system. The most common is bone fracture (Limeback, 2000), and the most common of these fracture types is hip fracture (Diesendorf et al., 1997). Moreover, data from 29 studies indicate that long-term consumption of drinking water with fluoride can result in bone fracture (Parnell et al., 2009). Indeed, even though these studies are mostly from the United States, it does not change the fact that the potential harmful effects of fluoride can happen to any group of people in the world as long as they are exposed to relatively large amounts of the chemical in water. The third and perhaps most difficult concern, which I hope Dr. Nokes will bring up and clarify, is that an excess of fluoride in the human body is simply "detrimental to long-term dental and overall health" (The Debate over Adding Fluoride in Our Water, 2013). This is indeed very alarming because people are actually not familiar with the standard amount of fluoride that a human body should take in, as well as the maximum levels of the chemical that the body can handle. Although the Environmental Protection Agency of the United States specifies 4 mg/L as the standard maximum tolerable amount of fluoride that Americans can take in, the data may be

Wednesday, November 20, 2019

What Factors did Account for South Africas 1994 Transition to Essay

What Factors did Account for South Africa's 1994 Transition to Democracy - Essay Example

This period was associated with racial, social, political and economic segregation which led to apartheid. On February 2nd 1990, President F.W. de Klerk delivered a speech that pointed to a decisive moment in South Africa's struggle for democracy (Decalo 7-35). The day is highly regarded by many South Africans, as it marked the announcement of the release of Nelson Mandela (on the 11th of February) and other detainees who had been arrested in the course of the struggle. This paved the way for open negotiations. South Africa had been going through a long struggle for democracy in a society that chiefly consisted of a white sub-society at the helm of leadership and power and a non-white sub-society with little or no influence in governance matters. The factors that led to the transition in South Africa can be classified as both internal and external. In his book Coups and Army Rule in Africa: Motivations and Constraints, Samuel Decalo argues that the transitions that led to democratization in South Africa were mainly internal. ... The limited freedom of expression saw most opposition parties denied access to the media when conducting their political functions. The media content was normally dominated by news on the authoritarian government. This had to be curbed, with revolution being the only effective tool (Decalo 20). Another factor suggested by Decalo is the institutional factor (25-35). Most of the dynamics that characterized the negotiations were institutionalized in the post-apartheid period. This led to significant stability and consolidation of democracy. The rules, norms, and formal and informal principles were widely accepted by the majority, making the transition process possible. According to Decalo, the most crucial dynamic that underwent institutionalization is constitutionalism, whereby all political groupings and civil organizations accepted the rule of law. The democratic changes that occurred in South Africa are also linked to international factors. According to Sola Akinrinade and Amadu Sesay in their book Africa in the Post-Cold War International System (eds.), the external factors that influenced the transition in South Africa include democratization in Eastern Europe and the end of the Cold War. The end of World War II saw a rise in the global political struggle for power between the United States and its associates from the West, and the Soviet Union and the Warsaw Pact, its allies in Eastern Europe (Akinrinade & Sesay, 92-128). According to Akinrinade and Sesay (1998), the Eastern European group had less democratically developed governments, and in the 1980s the Soviet Union and its Eastern European allies went through vigorous democratic transitions, a period that also saw East and South East Asian countries leave

Monday, November 18, 2019

WORLDCOM Essay Example | Topics and Well Written Essays - 1250 words

WORLDCOM - Essay Example

(1995), and MFS Communications Company (1996). WorldCom also acquired the parent company of Digex, Intermedia Communications, and sold all of Intermedia's non-Digex assets to Allegiance Telecom. Until 2003, WorldCom was considered a telecom giant and the second largest long distance provider in the U.S. A one-time high flyer like WorldCom should be held legally, ethically and socially responsible, having entered into a great many business contracts with its customers and suppliers. WorldCom should be responsible for its intensive mergers and its failure to use public funds (coming from public shareholders) carefully. It remains questionable for a huge company that had been consistently greedy in entering into mergers with other similar companies to suddenly file for bankruptcy in July 2002. As of 2005, WorldCom is still facing court trials regarding this matter. It is the legal responsibility of WorldCom's employers to ensure that the company's directors operate within society's accepted laws and regulations. It is their legal responsibility to register and communicate with the shareholders and to assure them that dividends are paid on time. Top management should also monitor the company's financial statements. WorldCom is facing huge financial and legal problems. The company is considered to have defrauded its investors by overstating its earnings by nearly $10 billion, with WorldCom's top management also profiting from their own criminal acts. WorldCom has to be held responsible for taking investors' money in excess of $176 billion and causing losses to WorldCom's employees, the state pension funds and shareholders through lost jobs, worthless stock, and losses of 401(k) savings. Overstating the company's earnings is clearly a criminal act, and it is punishable by law. One way or another, someone has to be held responsible for such unprofessional

Friday, November 15, 2019

Credit Risk Dissertation

Credit Risk Dissertation

CREDIT RISK

EXECUTIVE SUMMARY

The future of banking will undoubtedly rest on risk management dynamics. Only those banks that have an efficient risk management system will survive in the market in the long run. The major cause of serious banking problems over the years continues to be directly related to lax credit standards for borrowers and counterparties, poor portfolio risk management, or a lack of attention to deterioration in the credit standing of a bank's counterparties. Credit risk is the oldest and biggest risk that a bank, by virtue of the very nature of its business, inherits. It has, however, acquired greater significance in the recent past for various reasons. There have been many traditional approaches to measuring credit risk, such as the logit and linear probability models, but with the passage of time new approaches have been developed, such as CreditRisk+ and the KMV Model. The Basel I Accord was introduced in 1988 to provide a framework for regulatory capital for banks, but its "one size fits all" approach led to a shift to a new and comprehensive approach, Basel II, which adopts a three-pillar approach to risk management. Banks use a number of techniques to mitigate the credit risks to which they are exposed. RBI has prescribed adoption of the comprehensive approach for the purpose of CRM, which allows fuller offset of collateral security against exposures by effectively reducing the exposure amount by the value ascribed to the collateral. In this study, a leading nationalized bank is taken to study the steps taken by the bank to implement the Basel II Accord and the entire framework developed for credit risk management. The bank under study uses the credit scoring method to evaluate the credit risk involved in various loans/advances. The bank has set up special software to evaluate each case under various parameters and a monitoring system to continuously track each asset's performance in accordance with the evaluation parameters.

CHAPTER 1 INTRODUCTION

1.1 Rationale

Credit risk management in today's deregulated market is a big challenge. Increased market volatility has brought with it the need for smart analysis and specialized applications in managing credit risk. A well defined policy framework is needed to help the operating staff identify the risk-event, assign a probability to each, quantify the likely loss, assess the acceptability of the exposure, price the risk and monitor it right to the point where it is paid off. Generally, banks in India evaluate a proposal through the traditional tools of project financing, computing maximum permissible limits, assessing management capabilities and prescribing a ceiling for an industry exposure. As banks move into a new high-powered world of financial operations and trading, with new risks, the need is felt for more sophisticated and versatile instruments for risk assessment, monitoring and controlling risk exposures. It is, therefore, time that bank managements equip themselves fully to grapple with the demands of creating tools and systems capable of assessing, monitoring and controlling risk exposures in a more scientific manner. According to one estimate, credit risk accounts for about 70% of a bank's risk, with the remaining 30% shared between the other two primary risks, namely market risk (changes in market prices) and operational risk (failure of internal controls, etc.). Quality borrowers (Tier-I borrowers) were able to access the capital market directly without going through the debt route.
Hence, the credit route is now more open to lesser mortals (Tier-II borrowers). With margin levels going down, banks are unable to absorb the level of loan losses. Even in banks which regularly fine-tune credit policies and streamline credit processes, it is a real challenge for credit risk managers to correctly identify pockets of risk concentration, quantify the extent of risk carried, identify opportunities for diversification and balance the risk-return trade-off in their credit portfolio. The management of banks should strive to embrace the notion of 'uncertainty and risk' in their balance sheets and instill the need for approaching credit administration from a 'risk perspective' across the system by placing well drafted strategies in the hands of the operating staff with due material support for their successful implementation. There is a need for a strategic approach to Credit Risk Management (CRM) in Indian commercial banks, particularly in view of: (1) higher NPA levels in comparison with global benchmarks; (2) RBI's stipulations about dividend distribution by banks; (3) revised NPA and CAR norms; and (4) the New Basel Capital Accord (Basel II) revolution.

1.2 OBJECTIVES

* To understand the conceptual framework for credit risk.
* To understand credit risk under the Basel II Accord.
* To analyze the credit risk management practices in a leading nationalised bank.

1.3 RESEARCH METHODOLOGY

Research Design: In order to have a more comprehensive definition of the problem and to become familiar with the problems, an extensive literature survey was done to collect secondary data for the location of the various variables, probable contemporary issues and the clarity of concepts. Data Collection Techniques: The data collection technique used is interviewing. Data has been collected from both primary and secondary sources. Primary data was collected by making personal visits to the bank. Secondary data was collected from research papers, working papers and white papers published by various agencies like ICRA, FICCI, IBA etc., as well as articles from the internet and various journals.

1.4 LITERATURE REVIEW

* Merton (1974) applied the option pricing model as a technology to evaluate the credit risk of an enterprise, and it has drawn a lot of attention from western academic and business circles. Merton's model is the theoretical foundation of structural models. It is not only based on a strict and comprehensive theory but also uses market information (the stock price) as an important variable to evaluate credit risk. This allows credit risk to be monitored in real time at a much higher frequency. This advantage has made it widely applied by the academic and business circles for a long time. Other structural models try to refine the original Merton framework by removing one or more of its unrealistic assumptions.
* Black and Cox (1976) postulate that defaults occur as soon as a firm's asset value falls below a certain threshold. In contrast to the Merton approach, default can occur at any time. The paper by Black and Cox (1976) is the first of the so-called First Passage Models (FPM). First passage models specify default as the first time the firm's asset value hits a lower barrier, allowing default to take place at any time. When the default barrier is exogenously fixed, as in Black and Cox (1976) and Longstaff and Schwartz (1995), it acts as a safety covenant to protect bondholders. Black and Cox introduce the possibility of more complex capital structures, with subordinated debt.
* Geske (1977) introduces interest-paying debt to the Merton model.
* Vasicek (1984) introduces the distinction between short- and long-term liabilities, which now represents a distinctive feature of the KMV model. Under these models, all the relevant credit risk elements, including default and recovery at default, are a function of the structural characteristics of the firm: asset levels, asset volatility (business risk) and leverage (financial risk).
* Kim, Ramaswamy and Sundaresan (1993) have suggested an alternative approach which still adopts the original Merton framework as far as the default process is concerned but, at the same time, removes one of the unrealistic assumptions of the Merton model; namely, that default can occur only at maturity of the debt, when the firm's assets are no longer sufficient to cover debt obligations. Instead, it is assumed that default may occur at any time between the issuance and maturity of the debt and that default is triggered when the value of the firm's assets reaches a lower threshold level. In this model, the RR in the event of default is exogenous and independent of the firm's asset value. It is generally defined as a fixed ratio of the outstanding debt value and is therefore independent of the PD. The attempt to overcome the shortcomings of structural-form models gave rise to reduced-form models. Unlike structural-form models, reduced-form models do not condition default on the value of the firm, and parameters related to the firm's value need not be estimated to implement them.
* Jarrow and Turnbull (1995) assumed that, at default, a bond would have a market value equal to an exogenously specified fraction of an otherwise equivalent default-free bond.
* Duffie and Singleton (1999) followed with a model that, when market value at default (i.e. RR) is exogenously specified, allows for closed-form solutions for the term structure of credit spreads.
* Zhou (2001) attempts to combine the advantages of structural-form models (a clear economic mechanism behind the default process) with those of reduced-form models (unpredictability of default). This model links RRs to the firm value at default so that the variation in RRs is endogenously generated, and the correlation between RRs and credit ratings, reported first in Altman (1989) and Gupton, Gates and Carty (2000), is justified. Lately a portfolio view on credit losses has emerged, recognising that changes in credit quality tend to comove over the business cycle and that one can diversify part of the credit risk by a clever composition of the loan portfolio across regions, industries and countries. Thus, in order to assess the credit risk of a loan portfolio, a bank must not only investigate the creditworthiness of its customers, but also identify the concentration risks and possible comovements of risk factors in the portfolio.
* CreditMetrics by Gupton et al (1997) was publicized in 1997 by JP Morgan. Its methodology is based on the probability of moving from one credit quality to another within a given time horizon (credit migration analysis) and on the estimation of the portfolio Value-at-Risk due to credit (Credit-VaR). A rating system with probabilities of migrating from one credit quality to another over a given time horizon (transition matrix) is the key component of the Credit-VaR proposed by JP Morgan. The specified credit risk horizon is usually one year.
* Sy (2007) states that the primary cause of credit default is loan delinquency due to insufficient liquidity or cash flow to service debt obligations. In the case of unsecured loans, delinquency is assumed to be a necessary and sufficient condition. In the case of collateralized loans, delinquency is a necessary, but not sufficient, condition, because the borrower may be able to refinance the loan from positive equity or net assets to prevent default. In general, for secured loans, both delinquency and insolvency are assumed necessary and sufficient for credit default.

CHAPTER 2 THEORETICAL FRAMEWORK

2.1 CREDIT RISK

Credit risk is risk due to uncertainty in a counterparty's (also called an obligor's or credit's) ability to meet its obligations. Because there are many types of counterparties (from individuals to sovereign governments) and many different types of obligations (from auto loans to derivatives transactions), credit risk takes many forms, and institutions manage it in different ways. Although credit losses naturally fluctuate over time and with economic conditions, there is (ceteris paribus) a statistically measured, long-run average loss level. The losses can be divided into two categories, i.e. expected losses (EL) and unexpected losses (UL). EL is based on three parameters:

* The likelihood that default will take place over a specified time horizon (probability of default, or PD)
* The amount owed by the counterparty at the moment of default (exposure at default, or EAD)
* The fraction of the exposure, net of any recoveries, which will be lost following a default event (loss given default, or LGD)

EL = PD x EAD x LGD

EL can be aggregated at various different levels (e.g. individual loan or entire credit portfolio), although it is typically calculated at the transaction level; it is normally expressed either as an absolute amount or as a percentage of transaction size. It is also both customer- and facility-specific, since two different loans to the same customer can have a very different EL due to differences in EAD and/or LGD. It is important to note that EL (or, for that matter, credit quality) does not by itself constitute risk; if losses always equaled their expected levels, then there would be no uncertainty. Instead, EL should be viewed as an anticipated "cost of doing business" and should therefore be incorporated in loan pricing and ex ante provisioning. Credit risk, in fact, arises from variations in the actual loss levels, which give rise to the so-called unexpected loss (UL). Statistically speaking, UL is simply the standard deviation of losses around EL:

UL = σ(EL) = σ(PD x EAD x LGD)

Once the bank-level credit loss distribution is constructed, credit economic capital is simply determined by the bank's tolerance for credit risk, i.e. the bank needs to decide how much capital it wants to hold in order to avoid insolvency because of unexpected credit losses over the next year. A safer bank must have sufficient capital to withstand losses that are larger and rarer, i.e. losses that extend further out in the loss distribution tail. In practice, therefore, the choice of confidence interval in the loss distribution corresponds to the bank's target credit rating (and related default probability) for its own debt.
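To make these formulas concrete, here is a minimal sketch in Python. The loan figures are invented for illustration, the unexpected-loss line uses the simplifying assumption that default is a Bernoulli event with fixed EAD and LGD, and the capital multiplier K is an assumed value rather than a regulatory number:

```python
import math

# Illustrative single-loan parameters (hypothetical values).
pd_ = 0.02        # probability of default over one year
ead = 1_000_000   # exposure at default, in currency units
lgd = 0.45        # loss given default (fraction of EAD not recovered)

el = pd_ * ead * lgd                      # expected loss: EL = PD x EAD x LGD

# Treating default as a Bernoulli event with fixed EAD and LGD, the standard
# deviation of the one-year loss (the unexpected loss) is:
ul = ead * lgd * math.sqrt(pd_ * (1 - pd_))

k = 6.0                                   # assumed capital multiplier for the chosen confidence level
economic_capital = k * ul                 # capital held against unexpected losses

print(f"EL = {el:,.0f}  UL = {ul:,.0f}  Economic capital = {economic_capital:,.0f}")
```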
As the figure below shows, economic capital is the difference between EL and the selected confidence interval at the tail of the loss distribution; it is equal to a multiple K (often referred to as the capital multiplier) of the standard deviation of EL (i.e. UL). The shape of the loss distribution can vary considerably depending on product type and borrower credit quality. For example, high quality (low PD) borrowers tend to have proportionally less EL per unit of capital charged, meaning that K is higher and the shape of their loss distribution is more skewed (and vice versa). Credit risk may arise in the following forms:

* In the case of direct lending
* In the case of guarantees and letters of credit
* In the case of treasury operations
* In the case of securities trading businesses
* In the case of cross-border exposure

2.2 The Need for Credit Risk Rating

The need for credit risk rating has arisen for the following reasons:

1. With the dismantling of state control, deregulation, globalisation and allowing things to take shape on the basis of market conditions, Indian industry and Indian banking face new risks and challenges. Competition results in the survival of the fittest. It is therefore necessary to identify these risks, measure them, and monitor and control them.
2. It provides a basis for credit risk pricing, i.e. fixing the rate of interest on lending to different borrowers based on their credit risk rating, thereby balancing risk and reward for the bank.
3. The Basel Accord and consequent Reserve Bank of India guidelines require that the level of capital to be maintained by the bank be in proportion to the risk of the loans on the bank's books, for the measurement of which a proper credit risk rating system is necessary.
4. Credit risk rating can be a risk management tool for prospecting fresh borrowers, in addition to monitoring the weaker parameters and taking remedial action.

The types of risks captured in the bank's credit risk rating model: the model provides a framework to evaluate the risk emanating from the following main risk categories/risk areas:

* Industry risk
* Business risk
* Financial risk
* Management risk
* Facility risk
* Project risk

2.3 WHY CREDIT RISK MEASUREMENT?

In recent years, a revolution has been brewing in the way risk is both managed and measured. There are seven reasons for this surge in interest:

1. Structural increase in bankruptcies: Although the most recent recession hit at different times in different countries, most statistics show a significant increase in bankruptcies compared to prior recessions. To the extent that there has been a permanent or structural increase in bankruptcies worldwide, due to the increase in global competition, accurate credit analysis becomes even more important today than in the past.
2. Disintermediation: As capital markets have expanded and become accessible to small and mid-sized firms, the firms or borrowers "left behind" to raise funds from banks and other traditional financial institutions (FIs) are likely to be smaller and to have weaker credit ratings. Capital market growth has produced a "winner's curse" effect on the portfolios of traditional FIs.
3. More competitive margins: Almost paradoxically, despite the decline in the average quality of loans, interest margins or spreads, especially in wholesale loan markets, have become very thin. In short, the risk-return trade-off from lending has gotten worse.
A number of reasons can be cited, but an important factor has been the enhanced competition for low quality borrowers, especially from finance companies, much of whose lending activity has been concentrated at the higher risk/lower quality end of the market.
4. Declining and volatile values of collateral: Concurrent with the recent Asian and Russian debt crises, experience in well-developed countries such as Switzerland and Japan has shown that property and real asset values are very hard to predict and to realize through liquidation. The weaker (and more uncertain) collateral values are, the riskier the lending is likely to be. Indeed, the current concerns about deflation worldwide have accentuated the concerns about the value of real assets such as property and other physical assets.
5. The growth of off-balance-sheet derivatives: In many of the very large U.S. banks, the notional value of the off-balance-sheet exposure to instruments such as over-the-counter (OTC) swaps and forwards is more than 10 times the size of their loan books. Indeed, the growth in credit risk off the balance sheet was one of the main reasons for the introduction, by the Bank for International Settlements (BIS), of risk-based capital requirements in 1993. Under the BIS system, banks have to hold a capital requirement based on the marked-to-market current values of each OTC derivative contract plus an add-on for potential future exposure.
6. Technology: Advances in computer systems and related advances in information technology have given banks and FIs the opportunity to test high-powered modeling techniques. A survey conducted by the International Swaps and Derivatives Association and the Institute of International Finance in 2000 found that survey participants (consisting of 25 commercial banks from 10 countries, with varying size and specialties) used commercial and internal databases to assess the credit risk on rated and unrated commercial, retail and mortgage loans.
7. The BIS risk-based capital requirements: Despite the importance of the above six reasons, probably the greatest incentive for banks to develop new credit risk models has been dissatisfaction with the BIS and central banks' post-1992 imposition of capital requirements on loans. The current BIS approach has been described as a 'one size fits all' policy, irrespective of the size of the loan, its maturity, and, most importantly, the credit quality of the borrowing party. Much of the current interest in fine-tuning credit risk measurement models has been fueled by the proposed BIS New Capital Accord (the so-called BIS II), which would more closely link capital charges to the credit risk exposure to retail, commercial, sovereign and interbank credits.

CHAPTER 3 CREDIT RISK APPROACHES AND PRICING

3.1 CREDIT RISK MEASUREMENT APPROACHES

1. CREDIT SCORING MODELS

Credit scoring models use data on observed borrower characteristics to calculate the probability of default or to sort borrowers into different default risk classes. By selecting and combining different economic and financial borrower characteristics, a bank manager may be able to numerically establish which factors are important in explaining default risk, evaluate the relative degree or importance of these factors, improve the pricing of default risk, be better able to screen out bad loan applicants and be in a better position to calculate any reserve needed to meet expected future loan losses.
To employ a credit scoring model in this manner, the manager must identify objective economic and financial measures of risk for any particular class of borrower. For consumer debt, the objective characteristics in a credit-scoring model might include income, assets, age, occupation and location. For corporate debt, financial ratios such as the debt-equity ratio are usually key factors. After the data are identified, a statistical technique quantifies or scores the default risk probability or default risk classification. Credit scoring models include three broad types: (1) linear probability models, (2) logit models and (3) linear discriminant models.

LINEAR PROBABILITY MODEL: The linear probability model uses past data, such as accounting ratios, as inputs into a model to explain repayment experience on old loans. The relative importance of the factors used in explaining past repayment performance then forecasts repayment probabilities on new loans; that is, it can be used for assessing the probability of repayment. Briefly, we divide old loans (i) into two observational groups: those that defaulted (Zi = 1) and those that did not default (Zi = 0). Then we relate these observations by linear regression to a set of j causal variables (Xij) that reflect quantitative information about the ith borrower, such as leverage or earnings. We estimate the model by linear regression:

Zi = Σj βj Xij + error

where βj is the estimated importance of the jth variable in explaining past repayment experience. If we then take these estimated βjs and multiply them by the observed Xij for a prospective borrower, we can derive an expected value of Zi as the probability of repayment on the loan.

LOGIT MODEL: The objective of the typical credit or loan review model is to replicate judgments made by loan officers, credit managers or bank examiners. If an accurate model could be developed, then it could be used as a tool for reviewing and classifying future credit risks. Chesser (1974) developed a model to predict noncompliance with the customer's original loan arrangement, where noncompliance is defined to include not only default but any workout that may have been arranged resulting in a settlement of the loan less favorable to the lender than the original agreement. Chesser's model, which was based on a technique called logit analysis, consisted of the following six variables:

X1 = (Cash + Marketable Securities) / Total Assets
X2 = Net Sales / (Cash + Marketable Securities)
X3 = EBIT / Total Assets
X4 = Total Debt / Total Assets
X5 = Total Assets / Net Worth
X6 = Working Capital / Net Sales

The estimated coefficients, including an intercept term, are

Y = -2.0434 - 5.24X1 + 0.0053X2 - 6.6507X3 + 4.4009X4 - 0.0791X5 - 0.1020X6

where the probability of noncompliance is P = 1 / (1 + e^(-Y)). Chesser's classification rule for the above equation is: if P > 0.50, assign to the noncompliance group, and if P ≤ 0.50, assign to the compliance group. (A worked numerical example of this model appears below.)

LINEAR DISCRIMINANT MODEL: While linear probability and logit models project a value for the expected probability of default if a loan is made, discriminant models divide borrowers into high or low default risk classes contingent on their observed characteristics (X). Altman's Z-score model is an application of multivariate discriminant analysis in credit risk modeling. Financial ratios measuring profitability, liquidity and solvency appeared to have significant discriminating power to separate the firms that fail to service their debt from the firms that do not.
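A minimal sketch of applying Chesser's logit model as listed above; the borrower ratios are invented purely for illustration:

```python
import math

def chesser_probability(x1, x2, x3, x4, x5, x6):
    """Probability of noncompliance from Chesser's logit model (coefficients as listed above)."""
    y = (-2.0434 - 5.24 * x1 + 0.0053 * x2 - 6.6507 * x3
         + 4.4009 * x4 - 0.0791 * x5 - 0.1020 * x6)
    return 1.0 / (1.0 + math.exp(-y))

# Hypothetical borrower ratios, for illustration only.
p = chesser_probability(
    x1=0.12,   # (cash + marketable securities) / total assets
    x2=8.5,    # net sales / (cash + marketable securities)
    x3=0.07,   # EBIT / total assets
    x4=0.55,   # total debt / total assets
    x5=2.1,    # total assets / net worth
    x6=0.15,   # working capital / net sales
)
group = "noncompliance" if p > 0.50 else "compliance"
print(f"P(noncompliance) = {p:.2f} -> assign to {group} group")
```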
These ratios are weighted to produce a measure (a credit risk score) that can be used as a metric to differentiate the bad firms from the set of good ones. Discriminant analysis is a multivariate statistical technique that analyzes a set of variables in order to differentiate two or more groups by minimizing the within-group variance and maximizing the between-group variance simultaneously. The variables taken were:

X1: Working Capital / Total Assets
X2: Retained Earnings / Total Assets
X3: Earnings Before Interest and Taxes / Total Assets
X4: Market Value of Equity / Book Value of Total Liabilities
X5: Sales / Total Assets

The original Z-score model was revised and modified several times in order to find a scoring model more specific to a particular class of firm. This resulted in the private firms' Z-score model, the non-manufacturers' Z-score model and the Emerging Market Scoring (EMS) model. (A numerical sketch using the original Altman weights appears below, after the RAROC discussion.)

3.2 NEW APPROACHES

TERM STRUCTURE DERIVATION OF CREDIT RISK: One market-based method of assessing credit risk exposure and default probabilities is to analyze the risk premium inherent in the current structure of yields on corporate debt or loans to similar risk-rated borrowers. Rating agencies categorize corporate bond issuers into at least seven major classes according to perceived credit quality. The first four ratings (AAA, AA, A and BBB) indicate investment-quality borrowers.

MORTALITY RATE APPROACH: Rather than extracting expected default rates from the current term structure of interest rates, the FI manager may analyze the historic or past default experience (the mortality rates) of bonds and loans of a similar quality. Here p1 is the probability of a grade B bond surviving the first year of its issue; thus 1 - p1 is the marginal mortality rate, or the probability of the bond or loan dying or defaulting in the first year, while p2 is the probability of the loan surviving the second year given that it has not defaulted in the first year, and 1 - p2 is the marginal mortality rate for the second year. Thus, for each grade of corporate borrower quality, a marginal mortality rate (MMR) curve can show the historical default rate in any specific quality class in each year after issue.

RAROC MODELS: Based on a bank's risk-bearing capacity and its risk strategy, it is thus necessary, bearing in mind the bank's strategic orientation, to find a method for the efficient allocation of capital to the bank's individual business areas, i.e. to define indicators that are suitable for balancing risk and return in a sensible manner. Indicators fulfilling this requirement are often referred to as risk adjusted performance measures (RAPM); the most commonly found forms are RORAC (return on risk adjusted capital) and RARORAC (risk adjusted return on risk adjusted capital). Net income is taken to mean income minus refinancing cost, operating cost, and expected losses. It should now be the bank's goal to maximize a RAPM indicator for the bank as a whole, e.g. RORAC, taking into account the correlation between individual transactions. Certain constraints, such as volume restrictions due to a potential lack of liquidity and the maintenance of solvency based on economic and regulatory capital, have to be observed in reaching this goal. From an organizational point of view, value and risk management should therefore be linked as closely as possible at all organizational levels.
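Returning to the Z-score variables X1-X5 listed above, here is a minimal sketch of the calculation. The section itself does not give the weights, so the ones used here are those commonly quoted for Altman's original 1968 model for publicly traded manufacturers, and the ratio values are invented:

```python
def altman_z(x1, x2, x3, x4, x5):
    """Z-score with the commonly quoted weights of Altman's original (1968) model."""
    return 1.2 * x1 + 1.4 * x2 + 3.3 * x3 + 0.6 * x4 + 1.0 * x5

# Hypothetical firm ratios, for illustration only.
z = altman_z(
    x1=0.20,  # working capital / total assets
    x2=0.25,  # retained earnings / total assets
    x3=0.10,  # EBIT / total assets
    x4=1.50,  # market value of equity / book value of total liabilities
    x5=1.20,  # sales / total assets
)
# Commonly cited cut-offs: Z > 2.99 "safe", 1.81-2.99 "grey zone", Z < 1.81 "distress".
print(f"Z-score = {z:.2f}")
```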
OPTION MODELS OF DEFAULT RISK (KMV MODEL): KMV Corporation has developed a credit risk model that uses information on the stock prices and the capital structure of the firm to estimate its default probability. The starting point of the model is the proposition that a firm will default only if its asset value falls below a certain level, which is a function of its liabilities. It estimates the asset value of the firm and its asset volatility from the market value of equity and the debt structure in the option-theoretic framework. The resultant probability is called the Expected Default Frequency (EDF). In summary, EDF is calculated in the following three steps:
i) Estimation of asset value and asset volatility from the equity value and the volatility of equity returns
ii) Calculation of the distance from default
iii) Calculation of the expected default frequency

CREDITMETRICS: CreditMetrics provides a method for estimating the distribution of the value of the assets in a portfolio subject to changes in the credit quality of individual borrowers. A portfolio consists of different stand-alone assets, defined by a stream of future cash flows. Each asset has a distribution over the possible range of future rating classes. Starting from its initial rating, an asset may end up in any one of the possible rating categories. Each rating category has a different credit spread, which will be used to discount the future cash flows. Moreover, the assets are correlated among themselves depending on the industry they belong to. It is assumed that asset returns are normally distributed and that changes in asset returns cause changes in the rating category in future. Finally, a simulation technique is used to estimate the value distribution of the assets: a number of scenarios are generated from a multivariate normal distribution and, discounting at the appropriate credit spread, the future value of the assets is estimated.

CREDITRISK+: CreditRisk+, introduced by Credit Suisse Financial Products (CSFP), is a model of default risk. Each asset has only two possible end-of-period states: default and non-default. In the event of default, the lender recovers a fixed proportion of the total exposure. The default rate is considered a continuous random variable. The model does not try to estimate default correlation directly; instead, the default correlation is assumed to be determined by a set of risk factors. Conditional on these risk factors, default of each obligor follows a Bernoulli distribution. To get the unconditional probability generating function for the number of defaults, it assumes that the risk factors are independently gamma distributed random variables. The final step in CreditRisk+ is to obtain the probability generating function for losses. Conditional on the number of default events, the losses are entirely determined by the exposure and recovery rate. Thus, the loss distribution can be estimated from the following input data:
i) Exposure of each individual asset
ii) Expected default rate
iii) Default rate volatilities
iv) Recovery rate given default

3.3 CREDIT PRICING

Pricing of credit is essential for the survival of enterprises relying on credit assets, because the benefits derived from extending credit should surpass the cost. With the introduction of capital adequacy norms, credit risk is linked to capital (a minimum 8% capital adequacy). Consequently, higher capital is required to be deployed if more credit risks are underwritten.
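A simplified Merton-style sketch of the three EDF steps described above: back out the asset value and asset volatility from equity data, compute a distance to default, and map it to a default probability with the normal CDF. KMV's actual EDF uses a proprietary empirical mapping from distance to default to historical default rates, so the final step here (a plain N(-DD)) is only an approximation, and all input numbers are invented:

```python
import numpy as np
from scipy.optimize import fsolve
from scipy.stats import norm

def merton_asset_value(E, sigma_E, D, r, T):
    """Step 1: solve for asset value V and asset volatility sigma_V from equity value and volatility."""
    def equations(p):
        V, sigma_V = p
        d1 = (np.log(V / D) + (r + 0.5 * sigma_V**2) * T) / (sigma_V * np.sqrt(T))
        d2 = d1 - sigma_V * np.sqrt(T)
        eq1 = V * norm.cdf(d1) - D * np.exp(-r * T) * norm.cdf(d2) - E   # equity priced as a call on assets
        eq2 = V * norm.cdf(d1) * sigma_V - E * sigma_E                   # link between equity and asset volatility
        return [eq1, eq2]
    return fsolve(equations, x0=[E + D, sigma_E * E / (E + D)])

# Hypothetical firm: equity value 4m, equity volatility 60%, debt of 8m due in 1 year, risk-free rate 3%.
E, sigma_E, D, r, T = 4e6, 0.60, 8e6, 0.03, 1.0
V, sigma_V = merton_asset_value(E, sigma_E, D, r, T)

# Step 2: distance to default (using r as the drift, a risk-neutral simplification).
dd = (np.log(V / D) + (r - 0.5 * sigma_V**2) * T) / (sigma_V * np.sqrt(T))

# Step 3: approximate default probability from the normal CDF (stand-in for KMV's empirical EDF mapping).
edf_approx = norm.cdf(-dd)
print(f"V = {V:,.0f}, sigma_V = {sigma_V:.2%}, DD = {dd:.2f}, approx PD = {edf_approx:.2%}")
```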
The decision (a) whether to maximize the returns on possible credit assets with the existing capital or (b) raise more capital to do more business invariably depends upon p Credit Risk Dissertation Credit Risk Dissertation CREDIT RISK EXECUTIVE SUMMARY The future of banking will undoubtedly rest on risk management dynamics. Only those banks that have efficient risk management system will survive in the market in the long run. The major cause of serious banking problems over the years continues to be directly related to lax credit standards for borrowers and counterparties, poor portfolio risk management, or a lack of attention to deterioration in the credit standing of a banks counterparties. Credit risk is the oldest and biggest risk that bank, by virtue of its very nature of business, inherits. This has however, acquired a greater significance in the recent past for various reasons. There have been many traditional approaches to measure credit risk like logit, linear probability model but with passage of time new approaches have been developed like the Credit+, KMV Model. Basel I Accord was introduced in 1988 to have a framework for regulatory capital for banks but the â€Å"one size fit all† approach led to a shift, to a new and comprehensive approach -Basel II which adopts a three pillar approach to risk management. Banks use a number of techniques to mitigate the credit risks to which they are exposed. RBI has prescribed adoption of comprehensive approach for the purpose of CRM which allows fuller offset of security of collateral against exposures by effectively reducing the exposure amount by the value ascribed to the collateral. In this study, a leading nationalized bank is taken to study the steps taken by the bank to implement the Basel- II Accord and the entire framework developed for credit risk management. The bank under the study uses the credit scoring method to evaluate the credit risk involved in various loans/advances. The bank has set up special software to evaluate each case under various parameters and a monitoring system to continuously track each assets performance in accordance with the evaluation parameters. CHAPTER 1 INTRODUCTION 1.1 Rationale Credit Risk Management in todays deregulated market is a big challenge. Increased market volatility has brought with it the need for smart analysis and specialized applications in managing credit risk. A well defined policy framework is needed to help the operating staff identify the risk-event, assign a probability to each, quantify the likely loss, assess the acceptability of the exposure, price the risk and monitor them right to the point where they are paid off. Generally, Banks in India evaluate a proposal through the traditional tools of project financing, computing maximum permissible limits, assessing management capabilities and prescribing a ceiling for an industry exposure. As banks move in to a new high powered world of financial operations and trading, with new risks, the need is felt for more sophisticated and versatile instruments for risk assessment, monitoring and controlling risk exposures. It is, therefore, time that banks managements equip them fully to grapple with the demands of creating tools and systems capable of assessing, monitoring and controlling risk exposures in a more scientific manner. 
According to an estimate, Credit Risk takes about 70% and 30% remaining is shared between the other two primary risks, namely Market risk (change in the market price and operational risk i.e., failure of internal controls, etc.). Quality borrowers (Tier-I borrowers) were able to access the capital market directly without going through the debt route. Hence, the credit route is now more open to lesser mortals (Tier-II borrowers). With margin levels going down, banks are unable to absorb the level of loan losses. Even in banks which regularly fine-tune credit policies and streamline credit processes, it is a real challenge for credit risk managers to correctly identify pockets of risk concentration, quantify extent of risk carried, identify opportunities for diversification and balance the risk-return trade-off in their credit portfolio. The management of banks should strive to embrace the notion of ‘uncertainty and risk in their balance sheet and instill the need for approaching credit administration from a ‘risk-perspective across the system by placing well drafted strategies in the hands of the operating staff with due material support for its successful implementation. There is a need for Strategic approach to Credit Risk Management (CRM) in Indian Commercial Banks, particularly in view of; (1) Higher NPAs level in comparison with global benchmark (2) RBI s stipulation about dividend distribution by the banks (3) Revised NPAs level and CAR norms (4) New Basel Capital Accord (Basel -II) revolution 1.2 OBJECTIVES To understand the conceptual framework for credit risk. To understand credit risk under the Basel II Accord. To analyze the credit risk management practices in a Leading Nationalised Bank 1.3 RESEARCH METHODOLOGY Research Design: In order to have more comprehensive definition of the problem and to become familiar with the problems, an extensive literature survey was done to collect secondary data for the location of the various variables, probably contemporary issues and the clarity of concepts. Data Collection Techniques: The data collection technique used is interviewing. Data has been collected from both primary and secondary sources. Primary Data: is collected by making personal visits to the bank. Secondary Data: The details have been collected from research papers, working papers, white papers published by various agencies like ICRA, FICCI, IBA etc; articles from the internet and various journals. 1.4 LITERATURE REVIEW * Merton (1974) has applied options pricing model as a technology to evaluate the credit risk of enterprise, it has been drawn a lot of attention from western academic and business circles.Mertons Model is the theoretical foundation of structural models. Mertons model is not only based on a strict and comprehensive theory but also used market information stock price as an important variance toevaluate the credit risk.This makes credit risk to be a real-time monitored at a much higher frequency.This advantage has made it widely applied by the academic and business circle for a long time. Other Structural Models try to refine the original Merton Framework by removing one or more of unrealistic assumptions. * Black and Cox (1976) postulate that defaults occur as soon as firms asset value falls below a certain threshold. In contrast to the Merton approach, default can occur at any time. The paper by Black and Cox (1976) is the first of the so-called First Passage Models (FPM). 
First passage models specify default as the first time the firm's asset value hits a lower barrier, allowing default to take place at any time. When the default barrier is exogenously fixed, as in Black and Cox (1976) and Longstaff and Schwartz (1995), it acts as a safety covenant to protect bondholders. Black and Cox also introduce the possibility of more complex capital structures, with subordinated debt.
* Geske (1977) introduces interest-paying debt to the Merton model.
* Vasicek (1984) introduces the distinction between short- and long-term liabilities, which now represents a distinctive feature of the KMV model. Under these models, all the relevant credit risk elements, including default and recovery at default, are a function of the structural characteristics of the firm: asset level, asset volatility (business risk) and leverage (financial risk).
* Kim, Ramaswamy and Sundaresan (1993) suggest an alternative approach which still adopts the original Merton framework as far as the default process is concerned but, at the same time, removes one of the unrealistic assumptions of the Merton model, namely that default can occur only at maturity of the debt, when the firm's assets are no longer sufficient to cover its debt obligations. Instead, it is assumed that default may occur at any time between the issuance and maturity of the debt, and that default is triggered when the value of the firm's assets reaches a lower threshold level. In this model, the recovery rate (RR) in the event of default is exogenous and independent of the firm's asset value. It is generally defined as a fixed ratio of the outstanding debt value and is therefore independent of the probability of default (PD). The attempt to overcome the shortcomings of structural-form models gave rise to reduced-form models. Unlike structural-form models, reduced-form models do not condition default on the value of the firm, and parameters related to the firm's value need not be estimated to implement them.
* Jarrow and Turnbull (1995) assume that, at default, a bond has a market value equal to an exogenously specified fraction of an otherwise equivalent default-free bond.
* Duffie and Singleton (1999) followed with a model that, when the market value at default (i.e., RR) is exogenously specified, allows for closed-form solutions for the term structure of credit spreads.
* Zhou (2001) attempts to combine the advantages of structural-form models (a clear economic mechanism behind the default process) with those of reduced-form models (unpredictability of default). This model links RRs to the firm value at default, so that the variation in RRs is endogenously generated, and the correlation between RRs and credit ratings reported first in Altman (1989) and Gupton, Gates and Carty (2000) is justified. Lately, a portfolio view on credit losses has emerged, recognising that changes in credit quality tend to co-move over the business cycle and that one can diversify part of the credit risk through a clever composition of the loan portfolio across regions, industries and countries. Thus, in order to assess the credit risk of a loan portfolio, a bank must not only investigate the creditworthiness of its customers, but also identify concentration risks and possible co-movements of risk factors in the portfolio.
* CreditMetrics (Gupton et al., 1997) was released in 1997 by JP Morgan. Its methodology is based on the probability of moving from one credit quality to another within a given time horizon (credit migration analysis).
The estimation of the portfolio Value-at-Risk due to credit (Credit-VaR) is carried out through CreditMetrics: a rating system with probabilities of migrating from one credit quality to another over a given time horizon (a transition matrix) is the key component of the Credit-VaR proposed by JP Morgan. The specified credit risk horizon is usually one year.
* Sy (2007) states that the primary cause of credit default is loan delinquency due to insufficient liquidity or cash flow to service debt obligations. In the case of unsecured loans, delinquency is assumed to be a necessary and sufficient condition. In the case of collateralized loans, delinquency is a necessary, but not sufficient, condition, because the borrower may be able to refinance the loan from positive equity or net assets to prevent default. In general, for secured loans, both delinquency and insolvency are assumed necessary and sufficient for credit default.

CHAPTER 2: THEORETICAL FRAMEWORK

2.1 CREDIT RISK

Credit risk is risk due to uncertainty in a counterparty's (also called an obligor's or credit's) ability to meet its obligations. Because there are many types of counterparties (from individuals to sovereign governments) and many different types of obligations (from auto loans to derivatives transactions), credit risk takes many forms, and institutions manage it in different ways. Although credit losses naturally fluctuate over time and with economic conditions, there is (ceteris paribus) a statistically measurable, long-run average loss level. The losses can be divided into two categories: expected losses (EL) and unexpected losses (UL). EL is based on three parameters:
* The likelihood that default will take place over a specified time horizon (probability of default, or PD)
* The amount owed by the counterparty at the moment of default (exposure at default, or EAD)
* The fraction of the exposure, net of any recoveries, which will be lost following a default event (loss given default, or LGD)

EL = PD x EAD x LGD

EL can be aggregated at various levels (e.g., individual loan or entire credit portfolio), although it is typically calculated at the transaction level; it is normally expressed either as an absolute amount or as a percentage of transaction size. It is also both customer- and facility-specific, since two different loans to the same customer can have very different EL due to differences in EAD and/or LGD. It is important to note that EL (or, for that matter, credit quality) does not by itself constitute risk; if losses always equaled their expected levels, there would be no uncertainty. Instead, EL should be viewed as an anticipated "cost of doing business" and should therefore be incorporated in loan pricing and ex ante provisioning. Credit risk, in fact, arises from variations in actual loss levels, which give rise to the so-called unexpected loss (UL). Statistically speaking, UL is simply the standard deviation of EL.

UL = σ(EL) = σ(PD x EAD x LGD)

Once the bank-level credit loss distribution is constructed, credit economic capital is determined by the bank's tolerance for credit risk, i.e., the bank needs to decide how much capital it wants to hold in order to avoid insolvency because of unexpected credit losses over the next year.
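To make these definitions concrete, here is a minimal sketch in Python of how EL and UL might be computed for a small loan book. The loan parameters are invented for illustration (they are not figures from the bank under study), the single-loan UL formula is a common Bernoulli approximation rather than the dissertation's own method, and the final line simply applies the capital-multiplier idea discussed below with an assumed value of K.

```python
import math

# Hypothetical loan-level parameters (illustrative only): each loan has a
# probability of default (PD), exposure at default (EAD) and loss given default (LGD).
loans = [
    {"pd": 0.02, "ead": 1_000_000, "lgd": 0.45},
    {"pd": 0.05, "ead":   400_000, "lgd": 0.60},
    {"pd": 0.01, "ead": 2_500_000, "lgd": 0.35},
]

def expected_loss(loan):
    """EL = PD x EAD x LGD for a single facility."""
    return loan["pd"] * loan["ead"] * loan["lgd"]

def unexpected_loss(loan):
    """A common single-loan approximation: UL = EAD x LGD x sqrt(PD x (1 - PD)),
    i.e. the standard deviation of the Bernoulli default loss with LGD treated as fixed."""
    return loan["ead"] * loan["lgd"] * math.sqrt(loan["pd"] * (1 - loan["pd"]))

portfolio_el = sum(expected_loss(l) for l in loans)
# Ignoring default correlation (a simplification), loan-level ULs add in quadrature.
portfolio_ul = math.sqrt(sum(unexpected_loss(l) ** 2 for l in loans))

K = 6.0  # hypothetical capital multiplier implied by the bank's target rating
economic_capital = K * portfolio_ul

print(f"Portfolio EL: {portfolio_el:,.0f}")
print(f"Portfolio UL (no correlation): {portfolio_ul:,.0f}")
print(f"Economic capital = K x UL: {economic_capital:,.0f}")
```

The value of K and the independence assumption are purely illustrative; in practice the loss distribution is skewed and default correlations matter, as the discussion below notes.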
A safer bank must have sufficient capital to withstand losses that are larger and rarer, i.e., that extend further out in the tail of the loss distribution. In practice, therefore, the choice of confidence interval in the loss distribution corresponds to the bank's target credit rating (and related default probability) for its own debt. In the loss distribution, economic capital is the difference between EL and the selected confidence level at the tail; it is equal to a multiple K (often referred to as the capital multiplier) of the standard deviation of EL (i.e., UL). The shape of the loss distribution can vary considerably depending on product type and borrower credit quality. For example, high-quality (low-PD) borrowers tend to have proportionally less EL per unit of capital charged, meaning that K is higher and the shape of their loss distribution is more skewed (and vice versa). Credit risk may arise in the following forms:
* in the case of direct lending
* in the case of guarantees and letters of credit
* in the case of treasury operations
* in the case of securities trading businesses
* in the case of cross-border exposure

2.2 The need for Credit Risk Rating

The need for credit risk rating has arisen for the following reasons:
1. With the dismantling of state control, deregulation, globalisation and allowing things to shape on the basis of market conditions, Indian industry and Indian banking face new risks and challenges. Competition results in the survival of the fittest. It is therefore necessary to identify these risks, measure them, and monitor and control them.
2. It provides a basis for credit risk pricing, i.e., fixing the rate of interest on lending to different borrowers based on their credit risk rating, thereby balancing risk and reward for the bank.
3. The Basel Accord and consequent Reserve Bank of India guidelines require that the level of capital maintained by the bank be in proportion to the risk of the loans on the bank's books, for the measurement of which a proper credit risk rating system is necessary.
4. Credit risk rating can be a risk management tool for prospecting fresh borrowers, in addition to monitoring the weaker parameters and taking remedial action.

Types of risks captured in the bank's credit risk rating model: the model provides a framework to evaluate the risk emanating from the following main risk categories/risk areas:
* Industry risk
* Business risk
* Financial risk
* Management risk
* Facility risk
* Project risk

2.3 WHY CREDIT RISK MEASUREMENT?

In recent years, a revolution has been brewing in how risk is both managed and measured. There are seven reasons for this surge in interest:
1. Structural increase in bankruptcies: Although the most recent recession hit at different times in different countries, most statistics show a significant increase in bankruptcies compared to prior recessions. To the extent that there has been a permanent or structural increase in bankruptcies worldwide, due to the increase in global competition, accurate credit analysis becomes even more important today than in the past.
2. Disintermediation: As capital markets have expanded and become accessible to small and mid-sized firms, the firms or borrowers "left behind" to raise funds from banks and other traditional financial institutions (FIs) are likely to be smaller and to have weaker credit ratings. Capital market growth has produced a "winner's curse" effect on the portfolios of traditional FIs.
3. More competitive margins: Almost paradoxically, despite the decline in the average quality of loans, interest margins or spreads, especially in wholesale loan markets, have become very thin. In short, the risk-return trade-off from lending has gotten worse. A number of reasons can be cited, but an important factor has been the enhanced competition for low-quality borrowers, especially from finance companies, much of whose lending activity has been concentrated at the higher-risk/lower-quality end of the market.
4. Declining and volatile values of collateral: The recent Asian and Russian debt crises, together with experience in well-developed countries such as Switzerland and Japan, have shown that property and real asset values are very hard to predict and to realize through liquidation. The weaker (and more uncertain) collateral values are, the riskier the lending is likely to be. Indeed, current concerns about deflation worldwide have accentuated concerns about the value of real assets such as property and other physical assets.
5. The growth of off-balance-sheet derivatives: In many of the very large U.S. banks, the notional value of off-balance-sheet exposure to instruments such as over-the-counter (OTC) swaps and forwards is more than 10 times the size of their loan books. Indeed, the growth in credit risk off the balance sheet was one of the main reasons for the introduction, by the Bank for International Settlements (BIS), of risk-based capital requirements in 1993. Under the BIS system, banks have to hold a capital requirement based on the marked-to-market current value of each OTC derivative contract plus an add-on for potential future exposure.
6. Technology: Advances in computer systems and related advances in information technology have given banks and FIs the opportunity to test high-powered modeling techniques. A survey conducted by the International Swaps and Derivatives Association and the Institute of International Finance in 2000 found that survey participants (consisting of 25 commercial banks from 10 countries, of varying size and specialties) used commercial and internal databases to assess the credit risk on rated and unrated commercial, retail and mortgage loans.
7. The BIS risk-based capital requirements: Despite the importance of the above six reasons, probably the greatest incentive for banks to develop new credit risk models has been dissatisfaction with the BIS and central banks' post-1992 imposition of capital requirements on loans. The current BIS approach has been described as a 'one size fits all' policy, irrespective of the size of a loan, its maturity and, most importantly, the credit quality of the borrowing party. Much of the current interest in fine-tuning credit risk measurement models has been fueled by the proposed BIS New Capital Accord (the so-called BIS II), which would more closely link capital charges to the credit risk exposure of retail, commercial, sovereign and interbank credits.

CHAPTER 3: CREDIT RISK APPROACHES AND PRICING

3.1 CREDIT RISK MEASUREMENT APPROACHES

1. CREDIT SCORING MODELS

Credit scoring models use data on observed borrower characteristics to calculate the probability of default or to sort borrowers into different default risk classes.
By selecting and combining different economic and financial borrower characteristics, a bank manager may be able to establish numerically which factors are important in explaining default risk, evaluate the relative degree or importance of these factors, improve the pricing of default risk, be better able to screen out bad loan applicants and be in a better position to calculate any reserve needed to meet expected future loan losses. To employ a credit scoring model in this manner, the manager must identify objective economic and financial measures of risk for any particular class of borrower. For consumer debt, the objective characteristics in a credit-scoring model might include income, assets, age, occupation and location. For corporate debt, financial ratios such as the debt-equity ratio are usually key factors. After the data are identified, a statistical technique quantifies or scores the default risk probability or default risk classification. Credit scoring models include three broad types: (1) linear probability models, (2) logit models and (3) linear discriminant models.

LINEAR PROBABILITY MODEL: The linear probability model uses past data, such as accounting ratios, as inputs into a model to explain repayment experience on old loans. The relative importance of the factors used in explaining past repayment performance then forecasts repayment probabilities on new loans; that is, the model can be used to assess the probability of repayment. Briefly, we divide old loans (i) into two observational groups: those that defaulted (Zi = 1) and those that did not default (Zi = 0). We then relate these observations by linear regression to a set of j causal variables (Xij) that reflect quantitative information about the i-th borrower, such as leverage or earnings. We estimate the model by linear regression:

Zi = Σj βj Xij + error

where βj is the estimated importance of the j-th variable in explaining past repayment experience. If we then take these estimated βj's and multiply them by the observed Xij for a prospective borrower, we can derive an expected value of Zi, which can be interpreted as the probability of default on the loan.

LOGIT MODEL: The objective of the typical credit or loan review model is to replicate judgments made by loan officers, credit managers or bank examiners. If an accurate model could be developed, it could be used as a tool for reviewing and classifying future credit risks. Chesser (1974) developed a model to predict noncompliance with the customer's original loan arrangement, where noncompliance is defined to include not only default but any workout that may have been arranged resulting in a settlement of the loan less favorable to the lender than the original agreement. Chesser's model, which was based on a technique called logit analysis, consisted of the following six variables:

X1 = (Cash + Marketable Securities) / Total Assets
X2 = Net Sales / (Cash + Marketable Securities)
X3 = EBIT / Total Assets
X4 = Total Debt / Total Assets
X5 = Total Assets / Net Worth
X6 = Working Capital / Net Sales

The estimated coefficients, including an intercept term, are

y = -2.0434 - 5.24X1 + 0.0053X2 - 6.6507X3 + 4.4009X4 - 0.0791X5 - 0.1020X6

and the probability of noncompliance P is obtained from the logistic transform P = 1 / (1 + e^(-y)). Chesser's classification rule is: if P > 0.50, assign the borrower to the noncompliance group; if P ≤ 0.50, assign the borrower to the compliance group.
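As a small illustration of how a logit-style score such as Chesser's would be applied, the sketch below plugs made-up financial ratios for a hypothetical borrower into the coefficients quoted above and applies the 0.50 classification cut-off. The input values are purely illustrative and are not data from the bank in the study.

```python
import math

# Hypothetical borrower ratios (illustrative values only)
x = {
    "X1": 0.08,   # (Cash + Marketable Securities) / Total Assets
    "X2": 12.5,   # Net Sales / (Cash + Marketable Securities)
    "X3": 0.06,   # EBIT / Total Assets
    "X4": 0.55,   # Total Debt / Total Assets
    "X5": 2.4,    # Total Assets / Net Worth
    "X6": 0.15,   # Working Capital / Net Sales
}

def chesser_score(r):
    """Linear index y using the Chesser (1974) coefficients quoted in the text."""
    return (-2.0434
            - 5.24   * r["X1"]
            + 0.0053 * r["X2"]
            - 6.6507 * r["X3"]
            + 4.4009 * r["X4"]
            - 0.0791 * r["X5"]
            - 0.1020 * r["X6"])

y = chesser_score(x)
p = 1.0 / (1.0 + math.exp(-y))          # logistic transform of the score
group = "noncompliance" if p > 0.50 else "compliance"

print(f"y = {y:.4f}, P(noncompliance) = {p:.3f} -> {group} group")
```

The bank under study uses its own scoring parameters; this sketch only shows the mechanics of scoring and classification.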
LINEAR DISCRIMINANT MODEL: While linear probability and logit models project a value for the expected probability of default if a loan is made, discriminant models divide borrowers into high- or low-default-risk classes contingent on their observed characteristics (X). Altman's Z-score model is an application of multivariate discriminant analysis in credit risk modeling. Financial ratios measuring profitability, liquidity and solvency appear to have significant discriminating power to separate firms that fail to service their debt from firms that do not. These ratios are weighted to produce a measure (a credit risk score) that can be used as a metric to differentiate the bad firms from the set of good ones. Discriminant analysis is a multivariate statistical technique that analyzes a set of variables in order to differentiate two or more groups by minimizing the within-group variance and maximizing the between-group variance simultaneously. The variables taken were:

X1: Working Capital / Total Assets
X2: Retained Earnings / Total Assets
X3: Earnings before Interest and Taxes / Total Assets
X4: Market Value of Equity / Book Value of Total Liabilities
X5: Sales / Total Assets

The original Z-score model was revised and modified several times in order to make the scoring model more specific to particular classes of firms. This resulted in the private-firm Z-score model, the non-manufacturer Z-score model and the Emerging Market Scoring (EMS) model.

3.2 New Approaches

TERM STRUCTURE DERIVATION OF CREDIT RISK: One market-based method of assessing credit risk exposure and default probabilities is to analyze the risk premium inherent in the current structure of yields on corporate debt or loans to similar risk-rated borrowers. Rating agencies categorize corporate bond issuers into at least seven major classes according to perceived credit quality. The first four ratings (AAA, AA, A and BBB) indicate investment-quality borrowers.

MORTALITY RATE APPROACH: Rather than extracting expected default rates from the current term structure of interest rates, the FI manager may analyze the historic or past default experience, the mortality rates, of bonds and loans of similar quality. Here p1 is the probability of a grade B bond surviving the first year of its issue; thus 1 - p1 is the marginal mortality rate, or the probability of the bond or loan dying or defaulting in the first year, while p2 is the probability of the loan surviving the second year given that it has not defaulted in the first year, and 1 - p2 is the marginal mortality rate for the second year. Thus, for each grade of corporate borrower quality, a marginal mortality rate (MMR) curve can show the historical default rate in any specific quality class in each year after issue.
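As a small numerical illustration of the mortality-rate idea, the sketch below converts year-by-year survival probabilities p_i into marginal and cumulative mortality rates. The survival probabilities are invented for illustration, not historical data.

```python
# Hypothetical survival probabilities for a grade B bond in each year after issue:
# p[i] is the probability of surviving year i+1 given survival up to that point.
p = [0.97, 0.95, 0.94, 0.93]

mmr = [1 - pi for pi in p]              # marginal mortality rate in each year, 1 - p_i

cumulative_survival = 1.0
cmr = []                                # cumulative mortality rate up to each year
for pi in p:
    cumulative_survival *= pi
    cmr.append(1 - cumulative_survival)

for year, (m, c) in enumerate(zip(mmr, cmr), start=1):
    print(f"Year {year}: MMR = {m:.2%}, cumulative mortality = {c:.2%}")
```

In practice the MMR curve is estimated from the historical default experience of bonds and loans in each rating class, as the text describes.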
RAROC MODELS: Based on a bank's risk-bearing capacity and its risk strategy, it is necessary, bearing in mind the bank's strategic orientation, to find a method for the efficient allocation of capital to the bank's individual business areas, i.e., to define indicators that are suitable for balancing risk and return in a sensible manner. Indicators fulfilling this requirement are often referred to as risk-adjusted performance measures (RAPM); the most commonly found forms are RORAC (return on risk-adjusted capital) and RARORAC (risk-adjusted return on risk-adjusted capital). Net income is taken to mean income minus refinancing cost, operating cost and expected losses. It should be the bank's goal to maximize a RAPM indicator, e.g. RORAC, for the bank as a whole, taking into account the correlation between individual transactions. Certain constraints, such as volume restrictions due to a potential lack of liquidity and the maintenance of solvency based on economic and regulatory capital, have to be observed in reaching this goal. From an organizational point of view, value and risk management should therefore be linked as closely as possible at all organizational levels.

OPTION MODELS OF DEFAULT RISK (KMV MODEL): KMV Corporation has developed a credit risk model that uses information on the stock price and the capital structure of the firm to estimate its default probability. The starting point of the model is the proposition that a firm will default only if its asset value falls below a certain level, which is a function of its liabilities. It estimates the asset value of the firm and its asset volatility from the market value of equity and the debt structure in an option-theoretic framework. The resulting probability is called the Expected Default Frequency (EDF). In summary, EDF is calculated in the following three steps:
i) Estimation of asset value and asset volatility from the equity value and the volatility of equity returns
ii) Calculation of the distance to default
iii) Calculation of the expected default frequency
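To give a feel for step (ii), here is a minimal sketch of a simplified distance-to-default calculation. The default-point convention (short-term liabilities plus half of long-term liabilities) and the final mapping to a default probability via the normal distribution are common textbook simplifications, not KMV's proprietary procedure, and all input numbers are invented.

```python
from math import log, sqrt
from statistics import NormalDist

# Illustrative firm data (invented): in practice the asset value and asset volatility
# would be backed out from equity value and equity volatility (step i).
asset_value = 120.0          # estimated market value of assets
asset_vol = 0.25             # annualized asset volatility
short_term_debt = 40.0
long_term_debt = 30.0
horizon = 1.0                # one-year horizon
drift = 0.05                 # assumed expected asset return

# Common simplification of the default point: short-term debt + 0.5 x long-term debt
default_point = short_term_debt + 0.5 * long_term_debt

# Merton-style distance to default over the horizon
dd = (log(asset_value / default_point) + (drift - 0.5 * asset_vol ** 2) * horizon) / (asset_vol * sqrt(horizon))

# Mapping DD to a default probability with the normal CDF is only an approximation;
# KMV's EDF uses an empirical mapping based on its historical default database (step iii).
approx_pd = NormalDist().cdf(-dd)

print(f"Default point: {default_point:.1f}")
print(f"Distance to default: {dd:.2f} standard deviations")
print(f"Approximate one-year default probability: {approx_pd:.2%}")
```

Step (iii), the mapping from distance to default to an EDF, relies on KMV's historical default database and is only approximated here by the normal CDF.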
CREDITMETRICS: CreditMetrics provides a method for estimating the distribution of the value of the assets in a portfolio subject to changes in the credit quality of individual borrowers. A portfolio consists of different stand-alone assets, each defined by a stream of future cash flows. Each asset has a distribution over the possible range of future rating classes: starting from its initial rating, an asset may end up in any one of the possible rating categories. Each rating category has a different credit spread, which is used to discount the future cash flows. Moreover, the assets are correlated among themselves depending on the industry they belong to. It is assumed that asset returns are normally distributed and that changes in asset returns cause the changes in rating category in the future. Finally, simulation is used to estimate the value distribution of the assets: a number of scenarios are generated from a multivariate normal distribution, and in each scenario the future value of each asset is estimated by discounting at the appropriate credit spread.

CREDITRISK+: CreditRisk+, introduced by Credit Suisse Financial Products (CSFP), is a model of default risk. Each asset has only two possible end-of-period states: default and non-default. In the event of default, the lender recovers a fixed proportion of the total exposure. The default rate is treated as a continuous random variable. The model does not try to estimate default correlation directly; instead, default correlation is assumed to be determined by a set of risk factors. Conditional on these risk factors, the default of each obligor follows a Bernoulli distribution. To obtain the unconditional probability generating function for the number of defaults, the risk factors are assumed to be independently gamma-distributed random variables. The final step in CreditRisk+ is to obtain the probability generating function for losses; conditional on the number of default events, the losses are entirely determined by the exposures and recovery rates. Thus, the loss distribution can be estimated from the following input data:
i) Exposures of individual assets
ii) Expected default rates
iii) Default rate volatilities
iv) Recovery rates given default

3.3 CREDIT PRICING

Pricing of credit is essential for the survival of enterprises relying on credit assets, because the benefits derived from extending credit should surpass the cost. With the introduction of capital adequacy norms, credit risk is linked to capital: a minimum of 8% capital adequacy. Consequently, more capital is required to be deployed if more credit risk is underwritten. The decision (a) whether to maximize the returns on possible credit assets with the existing capital or (b) raise more capital to do more business invariably depends upon p

Wednesday, November 13, 2019

Use of Rhetoric in Jonathan Edwards' "Sinners in the Hands of an Angry God"

On July 8th, 1741, Jonathan Edwards preached the sermon "Sinners in the Hands of an Angry God" in Enfield, Connecticut. Edwards tells his listeners that God does not lack power, and that people have not yet fallen to destruction only because of his mercy. God is so forgiving that he gives his people an opportunity to repent and change their ways before it is too late. Edwards urges that the possibility of damnation is imminent and that it urgently requires the consideration of the sinner before time runs out. He preaches not only about the ways that make God so omnipotent, but also about the ways in which he is superior to us. In his sermon, Edwards uses strong, powerful, and influential words to clearly point out his message that we must amend our ways or else destruction is inevitable. Edwards appeals to the spectators through the use of various rhetorical devices, including diction, imagery, language/tone and syntax. Through the use of these rhetorical devices, Edwards's purpose is to remind the spectators that life is given by God and so they must live according to him. This include...