By Aryne + Dulcinea, July 10, 2018
It’s hard to believe that 2008 was 10 years ago. Portland home values have risen dramatically since then and our real estate market is robust. In the mortgage industry, we are still dealing with the fallout from the bubble of the early 2000s, but the recent wave of new financial technology is showcasing the impact of automation and big data on mortgage underwriting.
When default rates rose, investors (and Fannie Mae) rightly reexamined loans in their portfolios for fraud and errors. If they could document fraud or underwriting mistakes, they could force the lender to buy the mortgage loan back and avoid taking a loss themselves. Fraud, such as falsified income or asset documentation, was rare, but underwriting errors were more common. The error could be inaccurately calculated qualifying income, or something as simple as a mistyped street address on the appraisal. In difficult times, investors were looking for any reason to avoid a potential loss. This led to significant losses for lenders and structural changes in how loans were underwritten.
If you took out a mortgage in 2010, you know the result – incredible scrutiny and conservatism. As lenders sought to eliminate errors and outright fraud, multiple levels of re-checking were introduced. Underwriters were verifying the source of $100 deposits and asking for multiple explanation letters – this often led to long loan processing times and missed closing dates. For the lenders, it led to dramatically higher costs. The extra people required to meet new regulatory requirements and avoid buybacks increased production costs by up to 50%.
Income, assets and credit are relatively easy to quantify, and fewer of the buybacks were related to problems in those areas. Appraisals involve more judgment on the part of the appraiser and are more of a gray area. Lenders have less control over the appraisal process and were naturally concerned about their exposure. Fannie Mae and other investors fear overvaluations and errors that could lead to more losses in another downturn.
Beginning in 2010, FNMA introduced a uniform data set for appraisals, allowing them to build a super database of property sales. From here, we began to see the impact of automation and big data on mortgage underwriting processes. Every Fannie Mae loan’s appraisal becomes a part of this data set. Their automated system, Collateral Underwriter, is an audit checker that reconciles each individual appraisal against the data set and checks it for validity and potential errors. Once the appraisal has been through this filter successfully, the lender is no longer responsible for the appraisal’s accuracy. In essence, FNMA is validating the appraisal to reduce their own risk, and the lender gets relief from future liability.
Beginning in 2016, Fannie Mae introduced a system for direct verification of income and assets. Rather than getting the income and asset information from the borrower and then validating it, FNMA now offers a system of automated income and asset validation. The borrower’s employment and asset information are input, and through Fannie’s automated underwriting process, employment and assets are verified directly with the employer and bank. This is the basis of Rocket Mortgage™ and other automated systems. Currently, the system will only work on a small percentage of mortgage applications. Not all employers subscribe to the employment database FNMA uses, and not all banks offer direct verification. There’s also some consumer resistance (due to privacy concerns) to allowing lenders to gather information directly from buyers’ employers and banks.
For now, most mortgages are closed with the direct, personal involvement of underwriters, document drawers and quality control personnel. While limited data access has prevented a sea change in the mortgage process, most lenders utilize automated validation systems to reduce fraud, errors and cost. These processes are mostly invisible to borrowers, but do contribute to a smoother, quicker loan process. Will we see a time when all data is aggregated electronically and mortgages close in two weeks? Probably not – the history of the mortgage business is one of pendular swings between efficiency and losses. We’re currently swinging from triple checking everything to making the process easier and cheaper. We won’t know if this is the right path until the next crash – but the impact of automation and big data on mortgage underwriting underscores the importance of transparency. For now, look forward to a smoother, quicker, more reliable process, but keep your paystubs handy, just in case.
Aaron Nawrocki has over 20 years of direct experience overseeing mortgage and loan processes, working to provide clients the market insight and lending expertise required to make informed decisions.