CFOs’ Toolkit for Adopting AI

Find out interesting insights with Anna Tiomina, CFO & Founder, Blend2Balance

Moderated by Emily, Digital Transformation Consultant at Hyperbots

Don’t want to watch a video? Read the interview transcript below.

Emily: Hello, everyone. This is Emily, a digital transformation consultant at Hyperbots, and on the call, I’m really glad to have Anna with me. Anna is the CFO at Blend2Balance. In today’s discussion, we’ll be talking about a CFO’s preparatory toolkit for the adoption of AI. But before we dive in, Anna, would you share a brief overview of your background and perhaps set the stage for our discussion?

Anna: Sure. I’ve dedicated my entire career to finance, and I’ve been a CFO for more than 10 years. I’ve worked in various companies and industries. I started in steel manufacturing, spent around five years in pharmaceuticals, and joined an IT services company about four years ago. So I have a very versatile background in terms of industries. I also provide strategic consulting for early-stage startups. Since 2022, there’s been a huge emphasis on AI in all areas, including finance. Many organizations struggle to find the right approach to this transformative technology. It’s a pleasure to be here and shed some light on this crucial topic.

Emily: That’s really amazing. Great to have you as well, Anna. Let’s start with the first question. What would you recommend as the initial action for CFOs venturing into AI adoption?

Anna: I don’t recommend jumping into AI implementation initially. It’s worth running an audit in three main areas: data infrastructure, team skills, and the status of existing processes. For data infrastructure, it’s important to evaluate sources, ensure a single source of truth, address discrepancies, and prepare the data before implementing AI tools. Team readiness is paramount. Some teams are flexible with new technology, while others need more preparation to understand how it works. Lastly, the state of existing processes is vital. Are they unified and documented? Automating chaos leads to automated chaos, which is not what we want.

Emily: Completely agree. Those are insightful points, Anna. Moving forward, what key objectives would you recommend CFOs include in their AI strategic roadmap for the finance department?

Anna: When preparing the strategic roadmap for AI implementation, CFOs should focus on quantifiable objectives such as improving accuracy in financial forecasting, reducing processing times, and enhancing compliance and fraud detection. Setting a goal to automate 30% of manual data entry tasks within a year could significantly boost efficiency and accuracy. As a CFO, I’m always looking at the return on investment. AI implementation in finance operations should also consider potential savings and scalability if the organization plans to grow. Additionally, the cost of mistakes in finance operations is significant. AI can minimize errors, prevent fraud, and save the organization money in the long run.
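To make the kind of quantifiable objective Anna describes more concrete, here is a minimal back-of-the-envelope sketch in Python. All inputs (hours spent on manual data entry, hourly cost, tooling cost) are hypothetical assumptions for illustration only; only the 30% automation target comes from the interview.

```python
# Hypothetical estimate of the annual benefit from automating 30% of
# manual data entry. All figures below are illustrative assumptions,
# not numbers from the interview.

manual_hours_per_month = 400      # assumed hours the team spends on manual data entry
hourly_cost = 45.0                # assumed fully loaded cost per hour (USD)
automation_share = 0.30           # the 30% automation target mentioned above
annual_tool_cost = 36_000.0       # assumed annual cost of the AI tooling (USD)

hours_saved_per_year = manual_hours_per_month * 12 * automation_share
gross_savings = hours_saved_per_year * hourly_cost
net_savings = gross_savings - annual_tool_cost

print(f"Hours saved per year: {hours_saved_per_year:,.0f}")
print(f"Gross labor savings:  ${gross_savings:,.0f}")
print(f"Net annual benefit:   ${net_savings:,.0f}")
```

Even a rough estimate like this gives the roadmap a measurable target to track against, which is the point Anna makes about tying AI objectives to return on investment.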

Emily: True and valuable insights indeed. Considering your experience, what challenges should CFOs anticipate when aligning AI initiatives with their overall business strategy?

Anna: From what I’ve seen, security is a top concern among CFOs. Not understanding the technology can make it scary to let it make crucial decisions. Addressing security is crucial to reducing friction and gaining agreement from the rest of the team. I also recommend not rushing implementation. Let stakeholders adjust, understand the technology, and recognize its benefits to avoid big mistakes. In the long run, AI is a great technology. However, there’s pressure from leadership to implement it quickly to stay competitive. Finding the right balance between preparation and implementation and getting a leadership agreement is key.

Emily: Got it. Completely agree. Thank you so much, Anna, for sharing your insights and expertise on these critical aspects of adopting AI in finance. Any final thoughts or key takeaways you’d like to leave with our audience?

Anna: For CFOs feeling a bit lost in this process, I encourage them to do some reading or attend webinars. There’s a lot of information available, and it doesn’t take long to understand how the technology works and its benefits. Don’t be scared. It’s exciting to see changes in this market since finance automation tools haven’t seen a revolution since the 1970s.

Emily: That’s some great advice. Thank you so much, Anna, for being here and speaking on a topic that’s buzzing everywhere. It was truly amazing having you here today.

Anna: My pleasure.

ROI on AI-led Automation Initiatives in Finance

Find out interesting insights with Bimal Shah, CFO, Corium, & Strategic Advisor

Moderated by Emily, Digital Transformation Consultant at Hyperbots

Don’t want to watch a video? Read the interview transcript below.

Emily: Welcome to the latest installment of our interview series, where we delve into the intersection of finance and technology. Today, we are privileged to host Bimal Shah, an esteemed finance professional with extensive experience in the pharmaceutical industry, including serving as a CFO. Our focus for this session is on understanding the return on investment (ROI) of AI-led automation initiatives in finance. Let’s dive in!

Emily: Hello everyone, and welcome! I’m Emily, a digital transformation consultant at Hyperbots, and I’m thrilled to have Bimal joining us today. Bimal, before we jump into the details, could you please share a bit about your background?

Bimal Shah: Certainly, Emily. Thank you for having me. I’ve spent over a decade in senior financial roles within the life sciences industry, ranging from privately held firms to publicly traded companies. My expertise lies in navigating the complexities of finance in the pharmaceutical sector.

Emily: Thank you, Bimal, for that introduction. Let’s structure our discussion today into three key areas: understanding ROI methods, AI adoption in finance, and challenges and recommendations. Starting with ROI methods, Bimal, as a seasoned CFO, what frameworks have you employed to evaluate ROI?

Bimal: ROI, or return on investment, is paramount in financial decision-making. It can be measured through metrics such as internal rate of return, payback period, or simply as a ratio of investment returns. Assessing ROI involves considering factors like technology costs, implementation expenses, and potential cost savings or efficiency gains.
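To illustrate the three measures Bimal lists, here is a minimal sketch assuming a hypothetical automation project with an upfront investment followed by annual net savings; the cash-flow figures are illustrative, not from the interview. It computes a simple ROI ratio, the payback period, and the internal rate of return (found by bisection on the net present value).

```python
# Minimal sketch of the ROI measures Bimal mentions: a simple ROI ratio,
# the payback period, and the internal rate of return (IRR).
# The cash flows below are hypothetical, for illustration only.

cash_flows = [-120_000, 50_000, 55_000, 60_000, 60_000]  # year-0 investment, then annual net savings


def simple_roi(flows):
    """Total net return over the initial investment, as a ratio."""
    investment = -flows[0]
    return (sum(flows[1:]) - investment) / investment


def payback_period(flows):
    """Years until cumulative cash flow turns positive (fractional years)."""
    cumulative = flows[0]
    for year, flow in enumerate(flows[1:], start=1):
        if cumulative + flow >= 0:
            return year - 1 + (-cumulative) / flow
        cumulative += flow
    return None  # never pays back within the horizon


def npv(rate, flows):
    """Net present value of the cash flows at a given discount rate."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))


def irr(flows, low=-0.99, high=10.0, tol=1e-6):
    """Internal rate of return found by bisection on the NPV."""
    for _ in range(200):
        mid = (low + high) / 2
        if npv(mid, flows) > 0:
            low = mid
        else:
            high = mid
        if high - low < tol:
            break
    return (low + high) / 2


print(f"Simple ROI:     {simple_roi(cash_flows):.1%}")
print(f"Payback period: {payback_period(cash_flows):.2f} years")
print(f"IRR:            {irr(cash_flows):.1%}")
```

As Bimal notes later in the conversation, these quantifiable figures cover technology and labor costs; indirect costs and intangible benefits still need to be layered on top qualitatively.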

Emily: Fascinating insights, Bimal. Moving on to AI adoption in finance, which processes do you see as ripe for AI integration?

Bimal: Invoice processing, accounts payable, and accounts receivable management are prime candidates for AI adoption. These areas involve repetitive tasks that can benefit from automation, leading to cost savings and improved accuracy.

Emily: That’s insightful. And how would you prioritize AI adoption within the finance function?

Bimal: I would start with areas like accounts payable and receivable, where the tasks are relatively straightforward but labor-intensive. Demonstrating the benefits of AI in these areas can pave the way for adoption in more complex functions like financial planning and analysis.

Emily: Excellent advice, Bimal. Now, let’s delve into the nitty-gritty of calculating ROI. Could you elaborate on the quantitative and qualitative gains of AI-led automation?

Bimal: Quantitative gains include cost savings from reduced headcount and improved payment processing efficiency. On the qualitative side, benefits such as enhanced decision-making and employee satisfaction are harder to measure but equally valuable.

Emily: That’s a comprehensive overview. Bimal, how would you recommend measuring ROI for automation initiatives, considering both direct and indirect costs?

Bimal: Direct costs, such as technology investments and labor expenses, are relatively straightforward to quantify. However, capturing indirect costs and intangible benefits requires a more holistic approach. It’s essential to focus on measurable metrics while acknowledging qualitative gains.

Emily: Thank you for clarifying that, Bimal. As we near the end of our discussion, how would you suggest CFOs and controllers approach ROI measurement and publication for automation initiatives?

Bimal: I advocate for a balanced approach, emphasizing quantifiable benefits while acknowledging qualitative gains. Attempting to overly quantify intangible benefits may dilute the credibility of ROI calculations. Transparency and clarity are key when communicating the value of automation initiatives.

Emily: Wise counsel, Bimal. Finally, in terms of risk assessment, how do you recommend quantifying potential risks associated with AI implementation?

Bimal: While risks such as damaged relationships or employee concerns are challenging to quantify, they must be acknowledged and managed. Mitigating risks requires proactive communication, stakeholder engagement, and a focus on seamless implementation.

Emily: Thank you, Bimal, for your invaluable insights into maximizing ROI on AI-led automation initiatives in finance. It’s been a pleasure discussing these critical topics with you.

Bimal: Likewise, Emily. Thank you for hosting me, and I look forward to future conversations on the evolving landscape of finance and technology.

Emily: And there you have it, folks! A deep dive into the ROI of AI-led automation initiatives in finance, featuring insights from Bimal Shah, a seasoned CFO. Stay tuned for more enriching discussions on the intersection of finance and technology.

Risk Mitigation Framework of AI in Finance

Find out interesting insights with Cecy Graf, CFO & Strategic Advisor

Moderated by Emily, Digital Transformation Consultant at Hyperbots

Don’t want to watch a video? Read the interview transcript below.

Emily: Hello, everyone. This is Emily, a digital transformation consultant at Hyperbots. Good morning, good evening, or good afternoon, depending on where you are, and welcome to today’s session on a risk mitigation framework for AI adoption in finance. I’m really glad to have Cecy on the call with me, who is an experienced CFO. So thank you so much, Cecy, for being a part of this discussion. Before we dive into it, can you please introduce yourself?

Cecy: Sure, my name is Cecy Graf. I am based in Seattle, Washington, and my experience has been in law firm management for almost the last 20 years.

Emily: Got it, thank you so much for your introduction, Cecy. Today’s session is essentially divided into three portions. The initial section involves examining the various risks linked to the adoption of AI, along with strategies for mitigating those risks. In the second part, we’ll cover the risk mitigation framework in depth, and in the third segment, we’ll delve into the financial and accounting sub-processes to determine which ones yield the highest return on investment with the least associated risk, distinguishing the most lucrative from the less profitable ones. So to kick things off, Cecy, as you might be aware, AI has started seeing real adoption in finance and accounting, and it is likely to accelerate. What risks do you see with AI adoption?

Cecy: I think the biggest concern that we have, particularly in the legal industry, is around compliance and legal concerns. Making sure that the information that is being provided actually adheres to all the things that we need to adhere to. There’s more, but those are the biggest concerns for us right now, and they are a real barrier to adoption.

Emily: Compliance and legal concerns. Let’s go deeper into this particular aspect. Please elaborate on the promised AI benefits versus the real returns.

Cecy: It’s really exciting. AI has the potential to completely revolutionize how we do things in the way that computers did, having a huge impact on how we process financial transactions and provide the information that is gleaned from those transactions. But there’s a classic garbage-in, garbage-out challenge: you only get high-quality output if you’re putting in high-quality inputs. This is of course not specific to AI; we have this problem with any of our information delivery. We have to be capturing good information and providing the right data points so that we can get the best outcome out of our AI tools.

Emily: Yeah, that’s correct. What would your recommendation be to really mitigate this particular risk?

Cecy: Better data. At this point, it just makes it so much more important, if we’re going to be automating and applying artificial intelligence to our data, that our data is crisp and clean and as perfect as possible. So really, having good data governance strategies in place so that you can leverage the power of AI.

Emily: Got it. Moving on to the next one: do you see security as a concern, and what exactly is the risk there?

Cecy: In the legal industry, we have very strict client confidentiality rules and obligations. Any threat to our data in terms of leakage or inappropriate use is a huge concern for us. Cybersecurity in general is a major focus for the industry because we, like every other industry, are under threat. But the risk of AI, and the things that concern people about AI in the legal industry, really come back to how we ensure client confidentiality. How do we protect against leakages? How do we protect against data breaches?

Emily: Do you have any recommendations to alleviate these security-related risks?

Cecy: Again, this isn’t new to AI, but having very strong data protection measures in place and making sure that your certifications, whether ISO or SOC 2, are in place. These certifications and the rigor required to obtain them protect your data. The challenge is with open AI tools, where people don’t realize how open these tools are. So, having very strong effective-use or appropriate-use policies in place, and ensuring everyone in the environment is familiar with those and understands the associated risks, is crucial.

Emily: True, I completely agree on that. So, Cecy, the perception of AI leading to job loss is real. What are your comments on that?

Cecy: I think there’s going to be a shift in jobs, not necessarily a loss of jobs. We had the same concerns when we started implementing computers in the workplace. People thought they were going to lose their jobs, but that didn’t happen. The jobs changed, the duties changed, but jobs didn’t disappear. We’re seeing that now with AI; it’s creating more work for lawyers as we navigate and figure out the necessary structure around this. I’m not as concerned about job loss; it’s more about positioning ourselves for the jobs of the future.

Emily: Got it. Any recommendations on how to change this perception in people?

Cecy: Investing in our people is key. AI is an exciting, innovative tool that can change how we perform and allow us to deliver higher-value work. AI can take away a lot of the grunt work, freeing people up to upskill and expand their horizons rather than fearing that AI will dictate how humans function in their roles.

Emily: From a finance and accounting context, do you see that as a real challenge?

Cecy: I don’t see it as much. It’s more about how you have your effective use policies in place and how you utilize and train your people to use those tools. AI is more about informing how people do what they do, providing better data to drive data-driven decisions rather than just following your gut. We’re not robots; people should direct the AI tools.

Emily: I completely agree. What is your suggestion to overcome any challenge, however minuscule?

Cecy: It’s all about culture and training your people, making sure that your team feels invested and engaged in the process. Creating that security within your environment that this isn’t a threat but an opportunity.

Emily: Got it. Revisiting one of the topics we discussed, the risk of AI output not being trustworthy, especially in finance, is one of the biggest risks. Can you share some examples?

Cecy: Sure. My biggest fear is getting inaccurate forecasts. There’s human error today, but if we are completely dependent on AI without applying our knowledge and expertise, we can end up managing to a forecast that is completely off the rails, which doesn’t position us to succeed. And compliance reporting is another area of concern. If we are dependent on AI for our compliance reporting without any checks and balances, we risk being out of compliance.

Emily: Can you suggest some methods to reduce this risk?

Cecy: It comes back to data quality and data governance, ensuring high-quality inputs. You need testing and validation, controls around your processes and approvals, human oversight, and transparency. Involving your team and being transparent about how AI is used is crucial.

Emily: Got it. Thank you so much, Cecy, for talking to us about the various risks associated with AI adoption and the strategies for mitigating these risks. It was great speaking to you today.

Cecy: Thank you, Emily. Thanks for having me.

Emily: Welcome back, Cecy. In the last segment, we covered the various risks linked to AI adoption. In this segment, we’ll dive deeper into the risk mitigation framework from a holistic perspective. There are two broad views on AI’s impact on compliance: one view is that AI improves auditability, visibility, transparency, and data-driven decisions, resulting in better compliance. The counter view is that one should take special measures to ensure compliance, especially where AI is involved. What are your views on that?

Cecy: I think both are true. AI has tremendous potential to significantly improve our compliance efforts. We can automate processes, enhance data quality, and improve compliance. But we still need oversight. A balanced approach is essential, recognizing both the potential and the challenges and risks associated with AI.

Emily: Got it. If you had to draw a risk mitigation framework for AI adoption, what critical components would you advise CFOs to include?

Cecy: As a CFO, it’s always about the return on investment. Quantifying the ROI and evaluating the associated risks is critical. This is true not just for AI but for any major decision.

Emily: Would you like to comment on the return on investment or ROI, especially in terms of AI-led transformation in various finance and accounting processes?

Cecy: Sure. Where there is high risk, there is the potential for high return. In terms of ROI, mergers and acquisitions are at the top of the list despite the high risk. The potential ROI of AI in M&A is significant, enhancing due diligence, market analysis, and integration planning, potentially saving millions and creating value through informed decision-making and strategic alignment. Next on my list is financial planning and analysis. AI can significantly improve forecasting, budget optimization, and strategic planning, directly impacting an organization’s financial health and growth trajectory. The ability to make more informed investment decisions offers substantial returns.

Emily: So, you mentioned that expense management is at the lowest risk and mergers and acquisitions at the highest. Can you elaborate a bit more on why that is?

Cecy: Expense management is low risk because it is already highly standardized. The processes are routine and involve less complex decisions, making them more amenable to AI automation. Errors in expense management generally have a limited financial impact compared to errors in other financial processes. There is usually a wealth of historical data available, making it easier for AI systems to learn and make accurate predictions. On the other hand, mergers and acquisitions involve high stakes, complex negotiations, legal considerations, and strategic decisions that require a deep understanding of multiple variables. While AI can greatly assist, the inherent complexity and high risk involved in M&A mean that human oversight and strategic thinking remain critical.

Emily: Alright, any closing comments before we wrap up this session?

Cecy: I would just reiterate that adopting AI is about being informed, cautious, and strategic. It’s about leveraging the technology to enhance your capabilities while understanding and mitigating the risks. Balancing innovation with oversight and continuous learning is key to successful AI integration in finance and accounting.

Emily: Thank you so much, Cecy. It was a pleasure talking to you.

Cecy: Thank you, Emily. It was great speaking with you too.