
“Assume it’s fake until proven otherwise.” That’s the advice I’ve given my kids when they’re scrolling through TikTok. Any photo, video, or voice clip online could be AI-generated. That’s right: those video influencers may not be real, and many more won’t be by this time next year. The safest mindset is to start with skepticism and then seek verification from a trusted source.

This isn’t just parenting advice; it’s becoming critical guidance for the corporate world.

According to a recent Wall Street Journal article (“AI Drives Rise in CEO Impersonator Scams,” Aug. 18), AI-powered CEO impersonation scams are on the rise, already responsible for over $200 million in losses in early 2025. Bad actors are leveraging deepfakes that look and sound authentic enough to trick employees into transferring funds or sharing sensitive information.

The lesson is clear:

  • Trust must be verified, not assumed.

  • Corporate safeguards need to evolve. Policies, processes, and technologies must be built with the assumption that any incoming request, video call, or message could be AI-generated.

  • Leaders need to model digital skepticism. Just as I’ve told my kids to question TikTok, executives should foster a culture where employees feel empowered to double-check, even if the request appears to come from the CEO.

At NOV8TIV Advisory, we believe the organizations that thrive will be those that treat verification as a core part of their digital strategy.