According to the Mortgage Bankers Association, the average net cost to originate a mortgage (all loan types) in the fourth quarter of 2018 was $8,611. Just five years earlier, in 2013, the cost to originate a loan was closer to $5,000.
There is no single reason for this astronomical increase, but industry economists cite low production volumes, increasing production expenses, intense competition, and a lull in refinance activity as contributing factors. Even as more financial institutions and mortgage banks embrace digital lending and standalone automation tools, cost pressures continue to mount, leaving many business leaders perplexed as to why investments in automation are not making a more significant dent or bringing Cost Per Loan down.
Randy Loghry, Senior Vice President at OpExNow, a division of Consolidated Analytics, says that he gets this question a lot. His response?
“Cost Per Loan is a good indicator of internal efficiency, but it is usually an estimate and provides little guidance as to where scalability and bottom-line performance can be specifically attacked on an ongoing basis. Without the specifics, the root problem will continue to plague the lender, and staffing levels will stay in lock-step with loan volume changes.”
Moreover, says Loghry, “If you want to control your Cost Per Loan, either on the origination side or on the servicing side, you need to control your processes. If you do not understand (at a granular level) which individual lending activities have the highest impact to your bottom line, you cannot control the cost side of your business. If a lender is simply dividing the total costs by the number of loans processed, they are applying a simple math approach to a much more complex issue.”
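The “simple math approach” Loghry describes can be sketched in a few lines. All figures here are hypothetical, chosen only to land near the MBA’s Q4 2018 average; they are not real benchmarks.

```python
# A minimal sketch of the top-down "simple math" Cost Per Loan calculation.
# Both inputs are hypothetical, for illustration only.
total_period_cost = 2_583_300   # all origination expenses for the period (hypothetical)
loans_closed = 300              # loans closed in the same period (hypothetical)

cost_per_loan = total_period_cost / loans_closed
print(f"Top-down Cost Per Loan: ${cost_per_loan:,.2f}")  # Top-down Cost Per Loan: $8,611.00
```

The limitation is visible in the sketch itself: two aggregate numbers in, one aggregate number out, with no visibility into which activities drive the total.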
Loghry, who has spent his entire career helping lenders pinpoint process and technology inefficiencies in a practical and immediate way, explained that you cannot blame your systems for your Cost Per Loan unless you have done the work to understand how your processes are implemented in those systems. Getting a better understanding will enable lenders to translate process data into operational management data and technology road-mapping data.
“A structured approach lets a lender vet the payback of individual optimization projects and prioritize them accordingly. You can throw a dart and hope for the best, or you can pinpoint bottom-line impact with a plan…and that plan does not have to include spinning up a large Lean Six Sigma team or project.”
At OpExNow, Loghry evaluates lenders’ loan origination platforms and processes and provides process automation roadmaps to clients. “While each project is different,” says Loghry, “I am constantly amazed at the staggering dollar amounts of inefficiencies that these optimization projects uncover.”
When it comes to using Cost Per Loan to make operational improvements, Loghry highlighted three key ways lenders can get a better handle on it:
1. Calculate your Cost Per Loan from the bottom up, not the top down. General Cost Per Loan calculations don’t drill down to show staff utilization rates, process inefficiencies, parallel processing opportunities, or wait times.
2. Once you have granular Cost Per Loan data, leverage it for improved staffing capacity estimates and ramp-up times. Predictability in resource needs improves markedly with better process understanding and can go a long way toward breaking the industry’s standard, cyclical overstaff-to-be-safe/RIF pattern.
3. Leverage your granular Cost Per Loan data to systematically prioritize and implement automation or process optimization opportunities. A more targeted and systematic Cost Per Loan calculation enables lenders to pinpoint bottom-line returns from a specific project and improve overall scalability.
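A bottom-up calculation along the lines of step 1 might look like the following sketch. Every activity name, touch count, minute figure, and hourly rate here is an illustrative assumption (not a real benchmark, and not OpExNow’s methodology); the point is only that rolling costs up from components makes the largest drivers, and therefore the best automation targets, visible.

```python
# Hypothetical bottom-up Cost Per Loan sketch: roll the labor cost up from
# per-activity components instead of dividing total expenses by loan count.
# All figures below are illustrative assumptions.

# (activity, touches per loan, minutes per touch, fully loaded $/hour)
activities = [
    ("application intake",   1, 45, 40.0),
    ("processing/doc chase", 3, 50, 38.0),
    ("underwriting",         2, 60, 55.0),
    ("closing/funding",      1, 55, 45.0),
    ("post-close QC",        1, 30, 35.0),
]

def activity_cost(touches: int, minutes: int, rate: float) -> float:
    """Labor cost one activity contributes to a single loan."""
    return touches * minutes / 60 * rate

per_activity = {name: activity_cost(t, m, r) for name, t, m, r in activities}
labor_cost_per_loan = sum(per_activity.values())

# Ranking the components shows where an automation dollar would land first.
for name, cost in sorted(per_activity.items(), key=lambda kv: -kv[1]):
    print(f"{name:22s} ${cost:8.2f}")
print(f"{'labor total':22s} ${labor_cost_per_loan:8.2f}")
```

In this made-up breakdown, underwriting and document chasing dominate the labor total, so a project automating either one has a calculable payback, which is exactly the kind of prioritization the top-down average cannot support.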
Simply put, lenders can markedly improve their Cost Per Loan and profitability, but it starts with uncovering, understanding, and managing hidden component costs. To get a true Cost Per Loan estimate, reach out to Randy Loghry, email@example.com