John Russell, after spending 15 “great” years with ASA, has taken his career in a slightly different direction but still maintains his focus on helping appraisers see all of the possibilities before them.
With that in mind, he started the company ValuSight Consulting.
“I’ve had so many conversations over the years with individuals who didn’t fully understand the opportunities available to them using their existing skills, experience, and interests, and felt really good when I could help guide them toward something new,” Russell told Valuation Review. “As appraisers go through the professional life cycle, I want to help inform what their ‘next’ can look like using my wide-ranging background and understanding of the whole of the profession. I’m also excited to help organizations find growth, something I really enjoyed while working at ASA.”
With the arrival of artificial intelligence (AI), appraisers seeking the growth Russell describes must exercise a level of due diligence when using AI technology in their work.
We asked Russell what he believes appraisers are, or should be, most concerned about when it comes to AI.
“I think the biggest fear appraisers have is replacement and displacement through AI. Other technologies to date have been neutral to beneficial to appraisers and the broader collateral risk community – HP-12Cs, AVMs, and appraisal software,” he told us. “Because AI is capable of the kind of analytical processes that are core to valuation work, it represents the first technology that could do everything an appraiser does today, with unmatched access to data and processing power.
“The good news is that, today, the technology is not yet capable of fully supplanting the appraiser,” Russell added. “The risk for now is thoughtless adoption of AI into the collateral risk process, including by appraisers. ‘The AI said so’ will never be a defense to errors, but unfortunately some users treat AI outputs as definitive when they’re, at most, highly probabilistic of the right answer.”
As to how detailed an appraiser’s due diligence will have to be when using ChatGPT and other AI tools, Russell offered the advice of “trust but verify.” Appraisers, he said, should always ask whatever model they rely on how it reached its conclusions, including its sources of information.
He also reminded appraisers to understand that the technology would rather provide a wrong answer than no answer, so if anything about an output feels off, they should trust their instincts and dig deeper.
Appraisers should also keep track of their own prompts and requests; while they can’t replicate outputs, they can capture the process by which an output was reached, and that record may prove critical if an issue arises.
And, as best they can, appraisers should retain their chats with the AI, which Russell thinks make a logical addition to a workfile.
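For appraisers (or the technologists supporting them) who want to formalize that record-keeping, here is a minimal sketch of one way it could be done. The helper name log_ai_exchange, the file ai_workfile_log.jsonl, and the example assignment details are all hypothetical, not anything Russell prescribed; the point is simply to show prompts, outputs, and timestamps being captured for a workfile.

```python
import json
from datetime import datetime, timezone
from pathlib import Path

# Hypothetical workfile log: one JSON object per line, so entries are easy to append and audit.
LOG_PATH = Path("ai_workfile_log.jsonl")

def log_ai_exchange(assignment_id: str, model: str, prompt: str, output: str) -> None:
    """Append one prompt/response pair, with a timestamp, to the workfile log."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "assignment_id": assignment_id,
        "model": model,
        "prompt": prompt,
        "output": output,
    }
    with LOG_PATH.open("a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")

# Example usage: record a draft neighborhood-description request alongside the model's reply.
log_ai_exchange(
    assignment_id="2024-0137",
    model="example-llm",
    prompt="Draft a one-paragraph neighborhood description for ...",
    output="(paste the model's response here before filing)",
)
```

A plain append-only log like this mirrors the spirit of the advice: it won’t reproduce an output, but it preserves how and when that output was obtained.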
“AI is the ultimate people pleaser. It wants to provide what it thinks you want to hear and will try to avoid saying ‘I don’t know’ at all costs,” Russell said. “Current models are informed as much by prior queries and responses to their outputs, so there will be times they output something they believe to be useful, even when it’s unsupported or entirely unrelated.”
The former ASA staffer also said common sense is key when appraisers decide whether, when, and how to adopt AI tools into their businesses.
When asked about some of those “common sense” practices, Russell first stressed that appraisers should know their tools and how those tools will use their inputs, noting that good AI tools provide options to turn off the sharing of confidential or proprietary information.
The last thing any appraiser would want, he pointed out, is for information they have shared with an AI to turn up in an answer somewhere else.
“Also, understand the limits of your tools. Some will tell you the ‘as of’ date in terms of the underlying data used to train the models, so be mindful if you’re using a model significantly beyond that date,” Russell shared. “As I said before, if anything about an output feels off, do some sleuthing. Ask the model what sources it used or how it reached its conclusions, and research these sources yourself.
“Lastly, understand that you, the appraiser, carry all the risk of relying on AI outputs. Don’t risk your career on unedited, unchecked AI outputs; always review for clarity, accuracy, and syntax. You can also find lists of overused AI words online, which can be a red flag for some clients; those you’ll want to replace,” he added.
Russell also cautioned that AI will “overpromise and underdeliver” for a long time to come.
He elaborated by saying everyone is so focused on what AI will do that people sometimes believe today’s tools already have the capacity to produce much of an appraisal report. At best, Russell suggested, AI tools are great writer’s aids, either for proofing existing work or providing a first pass at something like a neighborhood description.
“But we’re a way off from it reaching into the mathematical corners of an assignment, and it won’t have the learned judgment of a professional who consistently serves a community,” he said. “In short, the worst thing you can do is ask AI for something it’s not yet capable of, and then use the output. Take a conservative approach to usage, and you’ll avoid costly mistakes.”
Currently, there is a lot of hype within the valuation profession around AI, and with it a risk of rushing on the appraiser’s part: appraisers know AI saves time, but that speed can produce more questionable results when it comes to the data underlying a property’s valuation.
Russell had thoughts on this, as well.
“I think appraisers generally tend toward the risk-averse side, owing to how much risk they navigate on a daily basis from errors, omissions, reconsiderations of value, bias, and the like,” he said. “So, I don’t see appraisers buying into hype so hard and fast that it creates endemic problems with underlying values or the support involved.
“Where I think the hype is more pronounced is everyone surrounding the appraiser – from AMCs seeing AI as a faster, inexpensive review alternative, to lenders and secondary market participants using it to scan for problematic words or phrases,” Russell added. “The allure of a technology that can work quickly and cheaply at scale is attractive to any business, but I do wonder how robust the cross-checking is to avoid false positives for issues that get sent back to the appraiser or, worse yet, a state board. My hope is that there’s still human intervention in these uses.”
The industry veteran and business owner also outlined three specific steps he follows when considering how to adopt new technology.
The first is that appraisers need some level of understanding of how AI operates generally, and of the specific tool they prefer to use.
Appraisers, Russell emphasized, wouldn’t rely on software they didn’t know how to use, and AI is no different.
“The second step is that instinct or gut feeling test of whether to trust the output. Personally, I can trust, but only after I verify. Asking the AI how it reached its conclusions is critical to avoiding reliance on faulty outputs,” he said. “Third, ask whether you should be using AI for something you’ll be personally liable for if an issue arises. If you wouldn’t trust another person to perform something in your assignment or report without your involvement, AI should be treated similarly. Just because you can use AI doesn’t mean that you should.”
There is also the question of how state boards will view appraisers who use AI tools to collect the data needed to render an opinion of value. Might boards approach appraisal reports that rely on AI tools with doubt, or a degree of skepticism?
Honestly, Russell doesn’t think there’s enough consistent transparency around the use of AI for anyone to know definitively whether AI was used and created an issue.
“An investigation would uncover the issue, but unless the appraiser is putting something specific into the certification, it may not be obvious to boards,” he said. “I could see boards taking a harsh view of mistakes owing to AI use that was less than prudent – if for no other reason than to communicate to any appraiser in the jurisdiction that they need to be conscientious about AI use in assignments and reports.”
And as to reasons appraisers might “think twice” about using AI to perform specific valuation assignments, Russell expects that the more complex the assignment, the weaker the use case for AI.
“The more we deviate from norms with a technology that relies on data and prior interactions to reach an expected outcome, the more likely it is that any output wouldn’t be useful,” he said. “This goes back to the third leg of the AI adoption stool – should I be using AI on this specific assignment?”
Finally, we brought up the concern of USPAP compliance when using AI. And while Russell admitted to not being a USPAP expert and said he should probably “sidestep” the question, he did want to express one last thought.
“I know I come off as skeptical of AI and its potential, and as someone whose ability to communicate is central to my professional worth, I struggle with how to balance adoption without losing the voice and perspective that I’ve built over the years. That said, we all need to get our arms around AI in all its forms (writing, photos, video, audio) because it’s seeping into every corner of our lives,” he told us.
“Will AI make work easier? Absolutely,” Russell added. “The point here is not to avoid using AI, but to be clear-headed about how to avoid uses that create more problems than solutions. The last thing any of us wants is to be on the wrong side of a complaint or lawsuit simply for trying to save a few minutes using the tools available.”