The evolution of evaluating and managing credit risk
Jul 01, 2015 Walt Wojciechowski
Credit risk management is an essential practice for lenders, as it can help identify and resolve potential issues before they occur. However, the economic recession took its toll on creditors, leading to widespread changes within the sector and new lending challenges.
Credit risk management practices have evolved in recent years
A study by The Joint Forum, entitled "Developments in credit risk management across sectors: Current practices and recommendations," noted that the methods that firms use to evaluate and manage their credit risk have been significantly altered over the past decade or so, due in large part to the financial crisis that began in 2007. The source found that supervisors have increased their focus on stress tests and have attempted to develop more sophisticated analytic models.
The Joint Forum reported that firms have made a number of advancements in credit risk management processes, including changes to the reporting of exposures, enhancements to limit frameworks, regular diligence around monitoring limits and escalating breaches, and improvements in systems that enable more detailed reporting.
Traditional evaluations are ignoring key demographics
An ID Analytics study, entitled "Millennials: High risk or untapped opportunity?" found that young adults are turned down for loans at a much higher rate than in the past. This trend is not necessarily because millennials are high-risk consumers, however. Rather, the source proposed that this demographic is widely underserved because lenders evaluate their potential credit risk improperly, relying too heavily on traditional scores. These scores place significant weight on credit products that many millennials simply do not have, such as mortgage loans, auto loans and credit cards.
One of the big differences between this generation and its predecessors is its aversion to traditional lifestyles built around home and car ownership, purchases that have historically helped consumers build strong credit scores. Thus, ID Analytics argued, it is not fair to assess these individuals based on their parents' tendencies. Because most members of the demographic have lower credit scores than previous generations as a result of these different lifestyle choices, the source said, they are considered to be at higher risk than others.
To test this assumption, ID Analytics compared millennials, generation Xers and baby boomers who all had similar credit scores. The findings painted a more positive portrait of young borrowers: Only 1 percent of millennials were more than 12 months late in making a payment, compared to 2 percent of generation X members and 3 percent of baby boomers.
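The study's approach lends itself to a simple illustration. The sketch below is a minimal, hypothetical example of that kind of cohort comparison, not ID Analytics' actual methodology: the data, column names and score bands are all assumptions. It groups borrowers into comparable credit-score bands and then compares serious-delinquency rates across generations within each band.

```python
# Hypothetical sketch of a cohort-matched delinquency comparison.
# Data, column names and score bands are illustrative assumptions.
import pandas as pd

# Sample loan-performance records; in practice these would come from
# a lender's servicing data or a credit bureau sample.
records = pd.DataFrame({
    "generation":   ["millennial", "gen_x", "boomer", "millennial", "gen_x", "boomer"],
    "credit_score": [640, 645, 638, 710, 705, 715],
    "months_late":  [0, 14, 0, 0, 0, 13],
})

# Bucket borrowers into score bands so each generation is compared
# against peers with similar traditional credit scores.
records["score_band"] = pd.cut(
    records["credit_score"],
    bins=[300, 580, 670, 740, 850],
    labels=["subprime", "near_prime", "prime", "super_prime"],
)

# Flag serious delinquency (more than 12 months late, mirroring the
# cutoff cited in the study).
records["seriously_late"] = records["months_late"] > 12

# Serious-delinquency rate by generation within each score band.
delinquency_by_cohort = (
    records
    .groupby(["score_band", "generation"], observed=True)["seriously_late"]
    .mean()
)
print(delinquency_by_cohort)
```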
This research shows that millennials, as a whole, should not be deemed a high-risk generation simply because of their collectively low credit scores. Lenders have been missing out on potential profits by disregarding this untapped demographic, which values different achievements and does not place great emphasis on home and car ownership or having credit cards. Despite these nontraditional characteristics, millennials are a potentially lucrative market for lenders to tap into.
The Joint Forum noted that this is a development that might not be far off. Already, supervisors are beginning to discuss abandoning their reliance on external credit rating agencies when determining risk. Instead, many plan to emphasize internal qualitative analysis in order to best assess each applicant's potential. Such a transformation in standard operations could be greatly beneficial to millennials - and others with poor credit ratings - and might open up new doors for credit building.
The lending world is changing, and perhaps the most prominent reason for these changes is the generational shift away from traditional goals and planning. As more advanced analytic systems are developed, lenders will be provided with better insights into the actual credit risk of each individual.