To strengthen cybersecurity in FinTech, experts emphasize a layered approach that combines technology and human awareness. Rising threats like phishing, smishing, and fraud demand not just better tools but also vigilant, well-trained employees. Embedding security scans into software development, analyzing diverse data signals, and adopting a “defense in depth” strategy are all critical. Ultimately, staying curious, asking the right questions, and embracing evolving technologies—especially AI—can help organizations stay ahead of cyber risks.
Research available only to clients at this time.
*When vendors’ names or quotes are shared as examples in this document, it is to provide a concrete example of what was on display at the conference or what we heard during our research, not an evaluation or recommendation. Evaluation and recommendation of these vendors are beyond the scope of this specific research document.
Cybersecurity in healthcare is responsible for protecting the data that represents the life’s story of patients and the infrastructure that enables proper care. Managing and securing the plethora of edge devices and the interoperability of all the technologies is an increasing challenge. There are four steps to take to enhance your healthcare cybersecurity: select a framework, leverage defense in depth, automate where possible, and test your environment.
Analysis is only available to clients at this time.
*When vendors’ names or quotes are shared as examples in this document, it is to provide a concrete example of what was on display at the conference or what we heard during our research, not an evaluation or recommendation. Evaluation and recommendation of these vendors are beyond the scope of this specific research document.
AI is transforming media and entertainment, reshaping workflows and automating the tedious. This shift isn’t new—it’s an evolution already well underway. While concerns about disruption persist, AI is proving to be a powerful tool that enhances efficiency, making content more accessible and refining quality. From metadata enrichment to streamlined production, AI empowers professionals by eliminating the mundane and allowing more focus on creativity. Rather than replacing jobs, it shifts how work gets done, seamlessly integrating with existing processes to unlock new possibilities in storytelling, production, and distribution. The industry is adapting, and AI is at the heart of that transformation.
Target Audience Titles:
Chief Technology Officer, Chief Supply Chain Officer, Chief Digital Officer
Chief Data Officer, Chief Marketing Officer, Chief Content Officer
Head of AI and Machine Learning, Data Scientists
Product Managers, Content Managers, Sound Engineers, Distribution Engineers
Production Technologists, Streaming Platform Developers
Key Takeaways
AI is transforming workflows, automating the tedious, and reshaping media and entertainment.
Efficiency gains let creators focus on storytelling while AI enhances accessibility and searchability.
AI isn’t replacing jobs—it’s evolving how work gets done, integrating seamlessly into production processes.
We took the most frequently asked and most urgent technology questions straight to the technology experts gathering at NAB Show 2025. This Whisper Report addresses the question: How can AI and machine learning transform media and entertainment?
As DeepDub’s Oz Krakowski stated, “We see a lot of the work that was done before is now done differently.” These changes have already hit the headlines, for as Dell Technologies’ Tom Burns pointed out, “Everybody’s all concerned with Gen AI, but of course the writers’ and actors’ strikes were all about that.” Thus this AI transformation is not new but rather is already underway. Latakoo’s Jade Kurian offered this perspective: “The question now is how do we do this thoughtfully? How do we do it in a way that we don’t compromise ethics – where we don’t compromise people’s jobs? How do we make it flow back and forth where we take advantage of AI and machine learning to make our lives easier, make our lives better, and make entertainment and media better?” Cinnafilm’s Dom Jackson suggests we take another step back to gain a larger perspective: “There’s a lot of fear around AI and those technologies, and in that sense, I see those as part of a continuum of ongoing automation processes that have been going on since the industrial revolution. Everyone’s always scared when something new comes along, and then very quickly it becomes normal and it empowers us to work in new and different and usually more efficient ways.” See Figure 1 for the cornucopia of AI use cases suggested. Let’s explore some of those new, different, and more efficient ways.
One thing is for certain: there is no shortage of opportunity for AI to positively impact the media and entertainment sector. The favorite use case among all users, as Strada’s Michael Cioni put it: “AI is best for our industry as a utility form to do the mundane tasks … we need AI that automates mundane tasks like color correcting, sound noise reduction, video/audio noise reduction, face tagging, locations, objects, emotions – those are all the things that no one wants to sit and log footage for. AI can log it for us.” Or per Ross’s David Green, “AI is going to make us more efficient and more effective and let us focus on what we love doing, which is creating amazing content.” No matter which perspective you prefer – the elimination of the mundane or the freeing up to do the fun parts – AI is here to stay in media workflows. As Dell’s Tom Burns observed, “Machine learning has already transformed media and entertainment in so many invisible ways, from security to fixing single pixel defects to all kinds of low-level network functions and automated provisioning.”
When it comes to all that stored media, there are plenty of suggestions for how AI can assist as well. Per SNS’s Alex Hlavaty, “It actually can be an incredibly helpful buddy to sort through the stuff we don’t want to do, parse through petabytes worth of information, help us find assets more quickly, and just interact with our data in a much more meaningful way while reducing man hours doing things that are laborious.” Eon Media’s Greg Morrow observed, “Our customers have large libraries of video files that only have a file name or just a small title. We really enriched those assets with information to make those assets more usable by identifying people, places, things, emotional sentiment, ethnicity as part of those assets.” Dell Technologies’ Tom Burns noted, “One of the Gen AI things that is proving to be useful is for companies that have large archives, or studios that have the rights to a lot of content: using AI to extend the metadata and inform them of what they actually have in their archive allows them to make that more searchable and therefore more monetizable.” Increasing metadata for the purpose of search, and the related business cases that come from having truly searchable content, is a common theme. As Axle.ai’s Sam Bogoch simply stated, if “they can’t find it they can’t reuse it.”
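The enrich-then-search loop those experts describe can be reduced to a tiny sketch. Everything here is illustrative – the file names, the tag values, and the idea that an AI model supplies the tag list are our own assumptions, not any vendor’s pipeline:

```python
# Minimal sketch of metadata enrichment for searchability.
# In practice the tags would come from AI models (face tagging,
# sentiment, location detection); here they are hard-coded.
assets = [
    {"file": "clip_0412.mxf", "tags": []},
    {"file": "clip_0413.mxf", "tags": []},
]

def enrich(asset, detected_tags):
    """Merge AI-detected tags into an asset's metadata, deduplicated."""
    asset["tags"] = sorted(set(asset["tags"]) | set(detected_tags))
    return asset

def search(assets, tag):
    """Return the files whose metadata contains the given tag."""
    return [a["file"] for a in assets if tag in a["tags"]]

enrich(assets[0], ["beach", "sunset", "Jane Doe"])
print(search(assets, "sunset"))  # ['clip_0412.mxf']
```

The point of the sketch is the business case: before enrichment the search returns nothing; after it, the archive becomes findable and therefore reusable.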
In some use cases, such as sound editing, AI has completely transformed the task at hand. As DeepDub’s Oz Krakowski observed, “Just like you cannot imagine a graphics designer not using Photoshop – it’s unheard of, right? – however, 20 years ago this was extremely questionable. Nowadays, (the same goes for) thinking of doing voice design and editing dialogues without using AI.” Ross’s David Green offers additional suggestions: “Things like camera tracking – so instead of having to manually figure out where things are every single second and do manual keying, you know, markers and those sorts of things, we can use AI to automatically do those things … instead of having to flip through a manual.” Dell’s Tom Burns observed, “When you render your VFX, AI upsampling has gotten so good now that you can render at 2K and up to 4K and it looks better than if you rendered at 4K in the first place.” Wow! Upsampling rendering better than if it was 4K in the first place – now that is an improvement, by definition. Localization is another area that has been drastically impacted by AI. Today, AI tools such as those from Yella Umbrella are “making content accessible to users that wouldn’t have access to it normally, either because there’s no one to localize that content or just because the content that they want to access is not available in the accessibility form that they prefer.” If the form you prefer isn’t about language but more about duration, Magnify’s Ken Ruck shares that today one can “edit automatically (and) create clips automatically.” Clients may recall our coverage of Conference Whispers: NAB Show 2023, when automated shorts were first highlighted.
One of the most treasured advancements is workflow automation. As Eon Media’s Greg Morrow stated, the goal is “workflow automation to improve the efficiency of a media organization in order for the people in the organization to create higher value content and (do) less of the drudgery work.” In other words, AI can transform media and entertainment by enabling all to do more with less. But if you are worried about your job, Cinnafilm’s Dom Jackson assured, “Strangely, ultimately people always end up having jobs. They’re different jobs, but people always end up with plenty of work.”
As cyber and physical security continue to merge, proactive, multi-layered strategies are essential to safeguard critical assets in interconnected environments. Secure data practices, including encryption for data in transit, at rest, and during compute, ensure compliance with high security standards. Architectural resilience is crucial, integrating cybersecurity from the outset rather than retrofitting outdated systems. Correlating physical and cyber events provides valuable context. Finally, digitizing workflows streamlines response efficiency, minimizing the window of vulnerability during attacks.
Target Audience Titles:
Chief Technology Officer, Chief Security Officer
Chief Information and Security Officer, VP of Cybersecurity
Director Cyber Physical Security, Security Analyst
Cybersecurity Engineer, Incident Response Analyst
Key Takeaways
Data must be encrypted at rest, in transit, and during execution.
Cyber physical security requires a securely designed architecture from the start.
Cyber and physical threats must be correlated.
Only a digitized workflow can respond with the required speed to cyber physical threats.
As with all security, cyber physical security must also be concerned with “data security and encryption … that’s data in the device, data in transit, data at rest at the servers, and so for all of those things we have the highest level standards and we also meet more advanced requirements,” explained Bioconnect’s Edsel Shreve. The solution should be flexible enough to enable any data protection requirements that come into play. Edsel Shreve went on to further explain, “For example, you need to do certificate rotation for things like TLS encryption. So we can do those things – not every customer wants them, but those are the things that we’ve actually got in our system for the folks that have those higher level requirements. So it really is the combination of how do we make sure that they’re cyber secure sitting on the network, and then how do we make sure that they’re physically (secure) and the data is secure on the readers and devices themselves.” In addition, TBW Advisors LLC recommends confidential computing architectures for protection and privacy during computations. For additional information see Industry Whispers: Public is Private – Confidential Computing in the Cloud.
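The “data in transit” leg of that requirement comes down to enforcing modern TLS everywhere a device or service talks to a server. As a minimal sketch (the helper name and the `ca_file` parameter are our own assumptions, not Bioconnect’s implementation), a Python client can refuse legacy protocols and unverified certificates:

```python
import ssl

def make_tls_context(ca_file=None):
    """Build a client-side TLS context enforcing modern settings.

    Hypothetical helper for illustration; ca_file allows a custom
    trust root, e.g. for devices talking to a private server fleet.
    """
    ctx = ssl.create_default_context(cafile=ca_file)
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse legacy protocols
    ctx.check_hostname = True                     # verify server identity
    ctx.verify_mode = ssl.CERT_REQUIRED           # require a valid certificate
    return ctx

ctx = make_tls_context()
```

Certificate rotation then reduces to swapping the files this context loads on a schedule, without code changes on either end.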
Taking a 1968 Mustang and updating it to 2025 safety standards would be quite the challenge, and would likely end up with an ugly beast that is neither safe nor resembling a Mustang. Cyber physical security is no different than safety: it must be thought of and integrated from the very beginning. As LVT’s Steve Lindsey explained, “It starts with architecture. If we can rethink our architectures, we can start building with cyber security in mind.” The challenge of physical cyber security, Steve Lindsey continued, is that “for the longest time in the physical security space we’ve been using on premise systems, and as we’ve lifted and shifted those into the cloud … what complicates that is, as we’re deploying these systems, it’s not just cloud to end user, it’s cloud to IoT (Internet of Things) device, which is going through usually public cellular or satellite infrastructure itself, and there’s other things that need to be done to address that.”
The real power of cyber physical security is the two areas working together to correlate events. Through correlation, context and a greater understanding are realized. An example shared by Advancis’ Paul Shanks demonstrates this best: “Someone loses their badge – it falls out of their pocket – and they’re logged into the network from home and their badge is used at the building. Those two events by themselves are benign, but we take that together and create an alert for the operator to look into whether it is a cyber attack or a physical attack.”
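Paul Shanks’ lost-badge scenario maps directly onto a simple correlation rule. The sketch below is a hypothetical illustration – the field names, the 30-minute window, and the alert format are all our assumptions, not Advancis’ implementation:

```python
from datetime import datetime, timedelta

def correlate_badge_and_login(badge_events, login_events, window_minutes=30):
    """Flag users whose badge was used on-site while they had a remote session.

    Either event alone is benign; together they suggest a lost badge
    or a compromised account, and deserve operator review.
    """
    alerts = []
    window = timedelta(minutes=window_minutes)
    for badge in badge_events:
        for login in login_events:
            if (badge["user"] == login["user"]
                    and login["location"] == "remote"
                    and abs(badge["time"] - login["time"]) <= window):
                alerts.append({"user": badge["user"],
                               "reason": "badge used on-site during remote session"})
    return alerts

badge_events = [{"user": "jdoe", "time": datetime(2025, 4, 1, 9, 5)}]
login_events = [{"user": "jdoe", "time": datetime(2025, 4, 1, 9, 0),
                 "location": "remote"}]
print(correlate_badge_and_login(badge_events, login_events))
```

The design choice worth noting is that the rule raises a ticket for a human operator rather than acting autonomously; the correlation supplies context, the operator supplies judgment.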
Since as early as 2019, TBW Advisors LLC has been advising clients to automate security responses when possible, for the simple fact that you must: ransomware attacks were already taking place within a 35-minute window. In 2025, the cyber physical attack vector also calls for automation, or a digitized workflow at the very least. As Advancis’ Paul Shanks communicated, “We can take that and make that workflow digitized so that all they have to do is read, click, and go. Simple as that.”
To navigate evolving fintech regulations, experts at Fintech Meetup 2025 emphasized three key strategies: staying engaged with the field and regulatory agencies, structuring well-architected, stable solutions, and leveraging AI or copilots. Together, these proactive approaches help fintech firms stay ahead of regulatory shifts while maintaining security and efficiency.
Target Audience Titles:
Chief Technology Officer, Chief Security Officer, Chief Information and Security Officer, Chief Trust Officer, Chief Compliance Officer, Chief Risk Officer
Head of Product, VP of Product, Chief Marketing Officer, Data Protection Officer, Director of Data Protection
Enterprise Architect, Director of Data Governance, Chief Privacy Officer, Head of IT Audit
Key Takeaways
Today’s security breaches are the source of tomorrow’s regulations.
Security cannot be an afterthought; it must be planned from the beginning.
Leverage AI and Copilots that are integrated with your processes to aid employees.
We took the most frequently asked and most urgent technology questions straight to the fintech experts gathering at Fintech Meetup 2025. This Whisper Report addresses the question: How can we ensure compliance with evolving regulations? As Socure’s Matt Thompson shared, “I don’t think it’s enough in this space to be a passive observer or responsive or reactionary to regulations; there’s a lot of evolution right now happening.” Figure 1 shares three actions you can take to conquer evolving regulations.
One of the best actions an organization can take to stay on top of regulations is to stay engaged and in touch with the real world. First, real-world happenings such as hacks define future regulations. As SecurityMetrics’ Matt Cowart shared, “A QSA (Qualified Security Assessor) is really going to help you understand where you’re sitting, and as they are informed of the evolutions of technology and all the advances that are going on, having them connected with real world teams (helps).” Or, as Matt Thompson of Socure suggested, stay “engaged with the regulators and the development of the regulations themselves.” If you know what the regulators are working on in draft, you will not be surprised when it becomes law. Keep in mind that the reach of the company determines which exact regulators and which specific regulations apply. As OnFido’s Marie Millick shared, “We have a team of subject matter experts that are constantly researching. We also collaborate with the same team that works with Interpol around everything around data privacy and identity.”
Many suggest the best way to be prepared for evolving scenarios of all types is to start with a robust and secure foundation. As Onbe’s Tony McGee shared, “Our company is fully audited, fully solutioned, and architected to protect the data.” This architecture doesn’t act alone but is complemented with strong processes. Tony McGee further explained the importance of “ensuring that we build in the processes to make sure that every step of the way is a compliant one.” Together, architecture and processes form a robust foundation. This robust foundation enables Onbe to ensure “that the consumer understands all the fundamentals of the payout.”
Any clients at this phase should schedule an inquiry to receive guidance. We will set up a plan of inquiries during your journey to give you any guidance we may have or can gather to assist you. The plan should capture milestones including but not limited to strategy reviews, presentation reviews, and even architecture reviews.
Today, we are no longer left with antiquated tools. As Thetaray’s Adam Stuart pointed out, “With the traditional rule-based systems you have to know what you’re looking for to build that rule. But if you don’t know what you’re looking for, and you’re looking for these new patterns and behaviors that people are using, you can’t do that with a simple rule base, which is why cognitive AI is such an important feature to include.” In other words, in addition to keeping up to date and starting with a solid foundation, the tool itself contributes to the identification of potentially troubling patterns. Interface.ai’s Connor Tullilus draws us a picture of what this is like in the real world: “To be able, in real time, to have a co-pilot AI assistant sitting behind the scenes to assist them in the day-to-day operations. One, in real time being able to update your policies (and) procedures, while (two) being able to (use) the AI assistant hooking up with your current knowledge bases you share.”
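The rule-versus-pattern distinction Adam Stuart draws can be illustrated in a few lines. The z-score check below is our own stand-in for “cognitive AI”, chosen only to show why a fixed rule misses behavior that an account-relative model catches; it does not reflect Thetaray’s actual methods:

```python
import statistics

# A fixed rule only catches what you anticipated when you wrote it...
def rule_flag(amount, threshold=10_000):
    """Flag a transaction exceeding a hard-coded threshold."""
    return amount > threshold

# ...while even a simple statistical model flags behavior that is
# unusual *for this account*, with no pre-written rule for it.
def anomaly_flag(history, amount, z_cutoff=3.0):
    """Flag an amount far outside the account's historical distribution."""
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history)
    if stdev == 0:
        return amount != mean
    return abs(amount - mean) / stdev > z_cutoff

history = [120, 95, 110, 130, 105, 90, 115]   # typical activity for the account
print(rule_flag(2_500))              # False: below the fixed threshold
print(anomaly_flag(history, 2_500))  # True: wildly unusual for this account
```

A $2,500 payment slips under the $10,000 rule, yet it is more than a hundred standard deviations from this account’s normal activity; that is the gap pattern-based detection closes.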
*When vendors’ names or quotes are shared as examples in this document, it is to provide a concrete example of what was on display at the conference or what we heard during our research, not an evaluation or recommendation. Evaluation and recommendation of these vendors are beyond the scope of this specific research document.
The logistics and supply chain sector faces significant challenges with data. Issues include non-existent data, inconsistent formats, manual errors, and lack of historical context. These problems stem from complex processes and resistance to change. Human-machine interaction adds another layer of complexity. Generic AI models struggle due to the unique demands of logistics. Despite these hurdles, there are opportunities for generative AI to enhance efficiency and provide valuable insights. Successful implementation requires accurate, context-rich data and a willingness to transform processes. Embracing AI can lead to improved operations and better decision-making in the logistics industry.
Target Audience Titles:
Chief Supply Chain, Logistics, Procurement, Technology, and Data Officers
Supply Chain, Logistics, Procurement, Technology, BI, and Data Science Directors
ERP Specialist, Supply Chain IT, Data Scientists, BI and related managers
Key Takeaways
Inconsistent, incomplete, and manually entered data hinder AI’s effectiveness.
Poorly structured processes and a reluctance to adopt AI-driven solutions slow innovation.
Onboarding new suppliers and standardizing systems remains difficult.
Generic AI models don’t understand logistics-specific challenges.
We took the most frequently asked and most urgent questions straight to the logistics and supply chain experts in the industry. This Whisper Report addresses the question regarding the biggest challenges using generative AI in supply chain and logistics. The first challenge, however, is not unique to that industry, nor is it unique to generative AI. This challenge applies to all analysis and analytics, including all forms of AI – generative or not, regardless of the size of the models. Put simply, no matter how many ways you state it, when you put garbage data in, you will get garbage results.
Given the dominance of a common answer, this raises the question: is the logistics and supply chain sector in worse shape than other industries? More specifically, is the data itself within logistics and supply chain the problem, and if so, why? Put simply, and as depicted in Figure 1, the challenges go far beyond the data. As Don Addington of Cloud 9 Perception put it, “In the logistics space there is a level of complexity that is more complex than others.” These complexities arise for the following reasons.
Figure 1. Challenges using Generative AI in logistics
Data doesn’t exist
There is an ideal digital world which is very different from the physical world. As Owen Nicholson from Slamcore pointed out, “If you are not seeing real world deployments with all the gnarly things that go wrong, you are only creating idealized models that don’t work in the real world.” Distribution centers are full of human and robot workers, as well as machines from multiple manufacturers. Unlike construction, many of these machines have been in the same building since they entered it as brand new machines, long before the term generative AI existed. Logistics is not the neat and tidy world of fintech transactions.
Data is inconsistent
As Ben Tracy of Vizion pointed out, “(Many) skipped a few fundamental steps: being useful and being reliable… They don’t monitor data quality, they don’t have consistency amongst data formats, and their systems are not exportable for the data that is inside of them.” Or, as data professionals call it, ‘good old-fashioned data quality’. To put it in the simplest terms possible, we all learned early in elementary school that you need data in the same units to perform any math over the data. You do not add inches and feet together. You cannot add meters and feet together. You don’t speak globally about time without time zones. But perhaps most important, you cannot create data quality for, nor analyze, data you have not exported or cannot export.
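The units lesson can be made concrete with a minimal normalization sketch; the conversion table and the record format are illustrative assumptions:

```python
# Convert every length record to a common unit before aggregating.
TO_METERS = {"m": 1.0, "ft": 0.3048, "in": 0.0254}

def total_length_meters(measurements):
    """Sum mixed-unit (value, unit) length records in meters.

    Rejects unknown units rather than silently adding incompatible
    numbers, which is exactly the 'inches plus feet' mistake.
    """
    total = 0.0
    for value, unit in measurements:
        if unit not in TO_METERS:
            raise ValueError(f"unknown unit: {unit}")
        total += value * TO_METERS[unit]
    return total

records = [(2, "m"), (3, "ft"), (12, "in")]
print(round(total_length_meters(records), 4))  # 3.2192
```

Naively summing the raw values would give 17 of nothing in particular; normalizing first gives 2 + 0.9144 + 0.3048 = 3.2192 meters.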
Data is manual and mis-keyed
If you are wondering how bad that data can be, Dawn Favier of Green Screens provided some hard facts: “It’s not uncommon to flag 35% of their (customers’) data as dirty. Dirty meaning mis-keyed data – something tagged as full truckload when it’s partial.” Obviously, if one looked at data for a half-full truck and leveraged it as a full truck, the resulting analytics are useless. With 35% of one’s data being dirty, there is work involved before you can even hope for insights.
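That kind of mis-key is often catchable with a cross-field sanity check. The sketch below is purely illustrative – the field names and the weight threshold are our assumptions, not Green Screens’ actual rules:

```python
# Flag records whose load-type tag contradicts another field.
def flag_dirty(shipments, full_load_min_weight=15_000):
    """Return records tagged 'full truckload' whose weight says otherwise."""
    return [s for s in shipments
            if s["load_type"] == "FTL" and s["weight_lb"] < full_load_min_weight]

shipments = [
    {"id": 1, "load_type": "FTL", "weight_lb": 42_000},
    {"id": 2, "load_type": "FTL", "weight_lb": 7_500},   # mis-keyed: partial load
    {"id": 3, "load_type": "LTL", "weight_lb": 7_500},
]
print([s["id"] for s in flag_dirty(shipments)])  # [2]
```

Flagged records go to a review queue rather than into the training set, which is the work “before you can even hope for insights.”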
Data lacks historical context
For any AI to be successful, you need massive amounts of data over a very small problem so the mathematics behind the AI can provide useful information. As Atit Shah of Chetu explained, “Even if you have the right collection of data, you can generate incorrect forecasting. A lot of people do not have a huge history, or the history of the records, so they go into the gen AI because everyone is doing it, but it doesn’t meet their expectation.” No matter how powerful the technology, all forms of AI need good data. Furthermore, the data must have context to be useful for any advanced form of AI, including generative AI.
Bad Processes
One obvious reason for messy data is the messy, manual, and imprecise or undefined processes it represents. The biggest challenge, as Bill Driegert of Flexport shared, is simply “not slapping it (generative AI) on bad processes. There needs to be a lot of process engineering required to leverage AI.” If process re-engineering and establishing a clean data fabric is your organization’s Mt. Everest, TBW Advisors LLC offers a lot of first-hand experience and expertise to teams and executives via inquiry. Any clients at this phase should schedule an inquiry to receive guidance. We will set up a plan of inquiries during your journey to give you any guidance we may have or can gather to assist you. The plan will cover milestones including but not limited to strategy reviews, presentation reviews, and architecture reviews. It is not an area to go through without a guide on your side, even if the work is outsourced.
Resistance to change
It is always important to consider the culture of any organization when executing, or desiring to execute, change management. As Erica Frank of Optimal Dynamics put it, “(You) need to take a healthy assessment: how resistant are we to change, how are we going to challenge this from the top down?” As with any change management, executive buy-in with a business objective is critical to success. AI for the sake of AI is always a bad idea.
Perhaps the reason many in this space are resistant to change is that the change is constant. As Jason Augustine of WNS put it, “(The) environment keeps changing every 3-6 months.” Thus, discovering opportunities to align and integrate the transformational changes into these constant, already occurring network changes is a less tumultuous approach.
Human Machine Interaction
Logistics, like manufacturing and construction, has a lot of machines in the loop. Those machines may or may not be intelligent machines. Thus as Dr. Mario Bjelonic of Rivr.ai shared, “the challenge will come up in terms of how the humans and robots will act as a team together.” Optimizing the total solution over this shared space is the true goal. But as one organization is optimized, what about working between each organization?
As one expert put it, “the onboarding (of) suppliers cannot be done by AI.” That’s correct. Bringing each and every machine into the system, or each and every supplier and the complex array of data that each supplier has managed to coalesce, is itself not standardized and thus cannot be automated.
Can’t use Generic Gen AI
As Balaji Guntur of Hoptek pointed out, “Most of the models are very generalized.” “AI is data hungry, and you need to train it on real data. The biggest challenge (for) generative AI in logistics is that the generative models don’t know what logistics is doing. This is the main challenge,” added Sensos’ Aviv Castro. In summary, as best put by Nykaj Nair of Sugere, “You need highly accurate data that is relative to the company’s supply chain.”
With all the challenges discussed, it may seem discouraging. It is important to realize that a significant opportunity awaits, easily providing business justification for the work to transform – carefully. As Justin Liu of Alibaba.com put it, “We are continuously adopting AI into our workflow, into our latest and greatest features and functionalities, (for customers) to do their business more efficiently.” Rye Akervik of Shipsi believes the value is “in adding it as a first layer to understand the (customer) issue.” Mick Oliver of Dexory shared, “We don’t see it as a challenge; we see it as an opportunity to provide insights based on that data.” Rich Krul of Hoplite observed that the intelligent systems are “way more efficient, people get their answers a little faster,” and he thinks that is a good thing for the industry. Most importantly, as Georgy Melkonyan of Arnata pointed out, “(You) shouldn’t fear it (AI) is going to take your job. AI will not replace your job. The people that use AI are going to replace your job.”
*When vendors’ names or quotes are shared as examples in this document, it is to provide a concrete example of what was on display at the conference or what we heard doing our research, not an evaluation or recommendation. Evaluation and recommendation of these vendors are beyond the scope of this specific research document.
To effectively integrate AI into healthcare, focus on three key areas: risk, impact, and value. Achieving a Patient 360 view requires orchestrating various tools. AI is embedded in many healthcare solutions including those for asset location, employee safety, and security. Always have a strategy to integrate AI into workflows. Successful integration depends on strong partnerships and clear communication about AI capabilities and limitations.
Target Audience Titles:
Chief Supply Chain, Logistics Officer, Procurement, Technology, and Data Officers
Supply Chain, Logistics, Procurement, Technology, BI and Data Science Directors
ERP Specialist, Supply Chain IT, Data Scientists, BI and related managers
Key Takeaways
Generic AI models don’t understand logistics-specific challenges.
Inconsistent, incomplete, and manually entered data hinder AI’s effectiveness.
Poorly structured processes and a reluctance to adopt AI-driven solutions slow innovation.
Onboarding new suppliers and standardizing systems remains difficult.
We took the most frequently asked and most urgent questions straight to the logistics and supply chain experts in the industry. This Whisper Report addresses the question regarding the biggest challenges using generative AI in supply chain and logistics. The first challenge, however, is not unique to that industry nor is it unique to generative AI. This challenge applies to a all analysis and analytics including all forms of AI – generative or not regardless the size of the models. Put simply, no matter how many ways you state it, when you put garbage data in you will get garbage results.
Given the dominance of a common answer, this raises the question, is the sector of logistics and supply chain in worse shape versus other industries? More specifically, is the data itself within logistics and supply chain the problem and if so, why? Put simply and as depicted in Figure 1, the challenges go far beyond the data. As Don Addington of Cloud 9 Perception put it, “in logistics space there is a level of complexity that is more complex than others.” These complexities come in for the following reasons.
Data doesn’t exist
There is an ideal digital world which is very different from the physical world. As Owen Nicholson from Slamcore pointed out, “If you are not seeing real world deployments with all the gnarly things that go wrong you are only creating idealized models that don’t work in the real world.” Distribution centers are full of human and robot workers as well as machines from multiple manufacturers. Unlike construction, many of these machines are in the same building they entered at the start of their usefulness as brand new machines long before generative AI term existed. Logistics is not the neat and tidy world of fintech transactions.
Data is inconsistent
As Ben Tracy of Vizion pointed out, “(many) skipped a few fundament steps, being useful and being reliable… They don’t monitor data quality, they don’t have consistency amongst data formats, and their systems are not exportable for the data that is inside of them.” Or what data professionals call it- ‘good old fashioned data quality’. To put it in the simplest terms possible, we all learned early in elementary school you need data in the same units to perform any math over the data. You do not add inches and feet together. You cannot add meters and feet together. You don’t speak globally about time without time zones. But perhaps most important, you cannot create data quality nor can you analyze data you haven’t or cannot export.
Data is manual and miss-keyed
If you are wondering how bad that data can be, Dawn Favier of Green Screens provided some hard facts, “its not uncommon to flag 35% of their (customers) data as dirty. Dirty meaning miss-keyed data, something tagged as full truck load when its partial.” Obviously, if one looked at data for a half truck and leveraged for a full truck, the resulting analytics are useless. With 35% of one’s data being dirty, there is work involved before you can even hope for insights.
Data lacks historical context
For any AI to be successful, you need massive amounts of data over a narrowly scoped problem so the mathematics behind the AI can provide useful information. As Atit Shah of Chetu explained, “Even if you have the right collection of data, you can generate incorrect forecasting. A lot of people do not have a huge history or the history of the records, so they go into the gen AI because everyone is doing it, but it doesn’t meet their expectation.” No matter how powerful the technology, all forms of AI need good data. Furthermore, the data must have context to be useful for any advanced form of AI, including generative AI.
Bad Processes
One obvious reason for messy data is the messy, manual, and imprecise or undefined processes it represents. The biggest challenge, as Bill Driegert of Flexport shared, is simply “not slapping it (generative AI) on bad processes. There needs to be a lot of process engineering required to leverage AI.” If process re-engineering and establishing a clean data fabric is your organization’s Mt. Everest, TBW Advisors LLC offers first-hand experience and expertise to teams and executives via inquiry. Clients at this phase should schedule an inquiry; we will set up a plan of inquiries during your journey to provide any guidance we may have or can gather, covering milestones including but not limited to strategy reviews, presentation reviews, and architecture reviews. It is not an area to go through without a guide on your side, even if the work is outsourced.
Resistance to change
It is always important to consider the culture of any organization when executing, or desiring to execute, change management. As Erica Frank of Optimal Dynamics put it, “(you) need to take a healthy assessment: how resistant are we to change, how are we going to challenge this from the top down.” As with any change management, executive buy-in tied to a business objective is critical to success. AI for the sake of AI is always a bad idea.
Perhaps the reason many in this space are resistant to change is that the change is constant. As Jason Augustine of WNS put it, “(the) environment keeps changing every 3-6 months.” Thus, discovering opportunities to align and integrate transformational changes with the constant changes already occurring across the network is a less tumultuous approach.
Human Machine Interaction
Logistics, like manufacturing and construction, has a lot of machines in the loop. Those machines may or may not be intelligent machines. Thus as Dr. Mario Bjelonic of Rivr.ai shared, “the challenge will come up in terms of how the humans and robots will act as a team together.” Optimizing the total solution over this shared space is the true goal. But as one organization is optimized, what about working between each organization?
One challenge shared was that “the onboarding (of) suppliers cannot be done by AI.” That is correct. Bringing each and every machine into the system, or each and every supplier and the complex array of data that supplier has managed to coalesce, is itself not standardized and thus cannot be automated.
Can’t use Generic Gen AI
As Balaji Guntur of Hoptek pointed out, “Most of the models are very generalized.” “AI is data hungry, and you need to train it on real data. The biggest challenge for generative AI in logistics is that the generative models don’t know what logistics is doing. This is the main challenge,” added Aviv Castro of Sensos. In summary, as best put by Nykaj Nair of Sugere, “you need highly accurate data that is relative to the company’s supply chain.”
With all the challenges discussed, it may seem discouraging. It is important to realize that a significant opportunity awaits, easily providing the business justification for the work to transform — carefully. As Justin Liu of Alibaba.com put it, “we are continuously adopting AI into our workflow, into our latest and greatest features and functionalities, (for customers) to do their business more efficiently.” Rye Akervik of Shipsi believes the value is “in adding it as a first layer to understand the (customer) issue.” Mick Oliver of Dexory shared, “We don’t see it as a challenge, we see it as an opportunity, and provide insights based on that data.” Rich Krul of Hoplite observed that the intelligent systems are “way more efficient; people get their answers a little faster,” and he thinks that is a good thing for the industry. Most importantly, as Georgy Melkonyan of Arnata pointed out, “(You) shouldn’t fear it (AI) is going to take your job; AI will not replace your job. The people that use AI are going to replace your job.”
*When vendors’ names or quotes are shared as examples in this document, it is to provide a concrete example of what was on display at the conference or what we heard doing our research, not an evaluation or recommendation. Evaluation and recommendation of these vendors are beyond the scope of this specific research document.
To manage tariff costs in the supply chain, a two-pronged approach is recommended: cleaning up data for better decision-making and optimizing cost parameters. Digital transformation is crucial for navigating tariff challenges. Additionally, avoiding hidden costs, moving on-shore, reducing cycle costs, and leveraging free trade zones can help. Utilizing tools to understand total landed costs and diversifying suppliers and logistics providers are also key strategies.
What is the dominant advice?
We took the most frequently asked and most urgent questions straight to the logistics and supply chain experts in the industry. This Whisper Report addresses the question of how to manage tariff costs in one’s supply chain. For any professional even tangentially involved in fulfillment, supply chain, or logistics, it is easy to panic at the talk of tariffs. Beyond supply chain and logistics professionals, operations and financial executives are impacted by what is going on, as are the technologists and data experts required to thrive in such environments. As 4flow’s Adam Poch shared, “You have to have a nimble and agile supply chain to navigate that.” Or as FreightFacts’ Lance Healy put it, “our job is to react, anticipate if we can, but apply technology.” This suggests a two-pronged approach: clean up your data so you can optimize costs.
Vizion’s Ben Tracy offers “transparent and easy to access data to empower intelligent supply chain decisions.” Yes, digital transformation is required to successfully navigate this challenge; if you have not started, there is no more time to wait. Many solutions assume the data has already been collected for a technology team to clean and provide intelligence over. But logistics data is not transactional data, nor does it have a history of being clean and collected like financial data. In fact, logistics and supply chain have some of the messiest data, with many suggesting over 30% is dirty and useless. Research regarding a large variety of vendors involved in cleaning and digitizing logistics and supply chain data can be found in Conference Whispers: Manifest 2025 and Conference Whispers: Smart Retail Tech Expo 2024. This is a significant area of expertise offered to our clients through inquiry privileges.
For those somewhere along the digital transformation maturity scale, the problem of managing tariff costs now boils down to continuing to find ways to transform and manage supply costs. A summary of the six actions to manage tariff costs can be found in Figure 1.
Sensos’s CEO shared a story about onboarding a customer who was blindsided by hidden costs when products went through Africa without their knowledge. Per TrafficTech’s Hilary Ambro, “work with a customs broker who is vested in and understands your trade lanes as you are moving products so you can minimize those costs.”
An obvious way to reduce costs associated with tariffs is to move on-shore. Hoptek’s Sean Maharai suggests “working towards on shore, raw materials and ability to manufacture (and assemble) on shore.” Or as Mark Richards of AWI Logistics put it, “People are redesigning their supply chains. Instead of distribution in Canada or Mexico servicing the US, they are bringing the distribution back to the US.”
Any and every place one can reduce costs is valuable in such uncertain times. An exciting solution that can impact your cost per pallet, offering next-day delivery at ground shipping costs, is Aeros. Aeros is an eVTOL (electric vertical take-off and landing) vehicle that looks like a blimp and hovers over urban areas with the goods to deliver, drones and related charging stations with line of sight to deliver, and drone operators to operate them. Rye Akervik shared that their company, Shipsi, is an aggregator of last-mile and middle-mile networks. Shipsi’s solution is to “rate shop those networks, find the best partner, the best SLA and manage that customer experience.” Verity’s Taylor Wilson recommends “utilizing free trade zones to delay the tariffs and related payments to improve your cashflow.” Finally, if you are traveling between Canada and the USA, there is a new solution coming online in Fall 2025. As Manny Paiva of the Gordie Howe International Bridge shared, “You have a highway-to-highway route connection that will allow transport trucks to get their goods across the border within ~11 seconds!”
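The cash-flow value of the free-trade-zone deferral mentioned above can be put in rough numbers. The duty amount, deferral period, and cost of capital below are illustrative assumptions, not figures from the conference, and simple interest is used as an approximation:

```python
# Hypothetical sketch: the cash-flow benefit of deferring a duty payment via a
# free trade zone. All figures here are illustrative assumptions.

def deferral_benefit(duty: float, days_deferred: int, annual_rate: float) -> float:
    """Approximate financing benefit of paying a duty later (simple interest)."""
    return duty * annual_rate * (days_deferred / 365.0)

# Deferring a $100,000 duty by 90 days at an 8% annual cost of capital:
benefit = deferral_benefit(100_000.0, 90, 0.08)
```

Even before any duty is reduced or eliminated, simply shifting when it is paid frees working capital — which is the cashflow improvement the quote refers to.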
If an organization has reached digitization maturity, it can leverage top tools to understand its total landed costs. As Yikun Shao of Alibaba.com shared, they offer solutions with “tools to provide transparency to all of (the) costs related to cross border movement of goods so they can make more informed decisions.” But Alibaba.com does not stop there; they also provide tools that directly enable “diversity of suppliers as well as logistic(s) providers so you have options available.” At the end of the day, managing costs associated with tariffs is a subset of managing the total landed costs of any goods.
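A total landed cost roll-up, with the tariff as one line item among several, can be sketched as follows. The component names, rates, and quantities are illustrative assumptions, not Alibaba.com's actual cost model:

```python
# Hypothetical sketch of a total landed cost roll-up. Tariffs (duty) are one
# line item among several; all rates and amounts here are illustrative only.

def total_landed_cost(unit_price: float, qty: int, freight: float,
                      tariff_rate: float, fees: float) -> float:
    """Landed cost = goods value + freight + duty (tariff on goods value) + fees."""
    goods = unit_price * qty
    duty = goods * tariff_rate
    return goods + freight + duty + fees

# 1,000 units at $12 each, $2,400 freight, a 25% tariff, $350 broker/handling fees:
cost = total_landed_cost(12.0, 1_000, 2_400.0, 0.25, 350.0)
per_unit = cost / 1_000
```

Seen this way, a tariff change is just one parameter in the per-unit landed cost — which is why diversifying suppliers and lanes (changing the other parameters) is an equally valid lever.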
Business agility is a must during these pandemic times. Business agility requires data-driven decisions. Data-driven decision making requires data agility and the modernization of data management to enable business-led analytics. The most common, successful, and scalable data management modernizations involve data virtualization, iPaaS, and data hub technologies to provide a data layer.
Digital transformation requires business and execution agility. Modern data management solutions frequently provide six types of optimization. Push-down optimization, remote function execution, and functional compensation enable full leveraging of the environment. Automation — such as auto-ETL, no hard-coding, and the ability to leverage autoscaling without significant configuration — enables agility and simplifies even the most complex global hybrid environments.
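Push-down optimization, the first item above, can be illustrated in miniature: the data layer rewrites a request so filtering happens at the source system rather than after a full extract. The table and column names below are hypothetical, and real data-virtualization layers do this rewriting automatically:

```python
# Hypothetical sketch of push-down optimization: emit source-side SQL so only
# matching rows cross the network, instead of extracting everything and
# filtering locally. Table and column names are illustrative assumptions.

def build_pushdown_query(table: str, columns: list[str], predicate: str) -> str:
    """Compose a query that pushes the filter down to the source database."""
    return f"SELECT {', '.join(columns)} FROM {table} WHERE {predicate}"

# Without push-down, the data layer would fetch every row of `shipments` and
# filter in its own memory; with push-down, the source does the work:
query = build_pushdown_query(
    "shipments", ["id", "origin", "landed_cost"], "landed_cost > 10000"
)
```

The agility benefit is that the same logical request adapts to each source's capabilities — which is also where remote function execution and functional compensation come in, for sources that cannot execute a given operation themselves.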