Ways to convert AI-fearers to AI-believers

In my journey to start a successful B2B business that helps companies with their procedures, workflows and administration, the majority of people I talk to seem to have some sort of fear of AI. (The people who call the shots at a company don’t really have a background in IT/AI, or simply the time to follow this fast-paced new tech.)

I am almost certain this is due to a lack of knowledge and understanding of today’s AI.

Has anybody succeeded in reassuring those fearers without talking too much about the technical part?
It seems I can only explain with a lot of detail, which causes those fearers to lose sight of the main point.


I’m afraid that’s an almost impossible task at this stage.

The media is not covering this topic nearly enough. People are mostly left in the dark. They may have heard about ChatGPT and even tried it, but that’s about it.

The underlying problem behind this fear is the “not knowing it / not understanding it”. You can try to jot down a simplification, but if someone is afraid, they will start asking a lot of questions, especially if they have to put trust in a product, obviously.

Thing is: AI is pretty unpredictable in many cases. Unless you can actually guarantee that whatever solution you’re offering will not make mistakes, this will backfire when they get to that question - and they usually do.

Try leading with this; bring it up as soon as possible. Think about a way to convince someone to invest in something that makes mistakes (if it does), or show them, in a simplified way, that it can’t make mistakes (if that’s the case).

Fear is a very normal reaction to the unknown, especially when it does things you couldn’t have imagined - and could potentially provoke layoffs. That’s not an easy decision for a business owner or manager.

You seem to have specific experience with this - what’s the main problem? Can you give an example?

Hey Matz

Thanks for the reply.

I am currently trying to set up an application that could eliminate hours of paperwork (which, if done manually, could even be declined after submission). It’s about training and development for employees. When an employee chooses a course that is registered with the government, the company even gets compensated for the hours the employee is absent.

The people I talk to now are mostly HR managers. I’m not talking much about the application just yet; these are more like interviews for contract work with them, mostly done when they expect things to get busy or when an HR professional is absent for a long time. (There are a lot of female HR professionals, and some choose to have children. That’s at least a few months off work.)

I’m hoping to land such a project next week and slowly work my way towards the AI application. My guess is that after some time working there, trust will improve and I’ll have a good sense of what I’m talking about for that specific company.

Ending with a follow-up on your reply: how would you feel as a manager if AI allowed you to lay off people? Excited? Sad? More budget to spend on things other than people?


Hehe, I’m the wrong manager for this question, as that’s already reality for me and I’m the one pushing just that. I can’t imagine not knowing what I know, but I too have friends who struggle with it.

They are wary when I share what’s happening, frequently asking me about things they are confused by or afraid of. They have even started posting somewhat weird fake AI news. They can’t differentiate because they lack the knowledge (which might be an interesting point for you right there → the unknown is what sparks the fear, as mentioned before).

To answer your question as well as possible:
If something threatens my employees, it has to offer me huge value for me to even consider it. To me, they are not just expenses that lead to income; they are people.

I can tell you what I do when I don’t know or understand what someone’s pitching to me: I’ll start asking questions, and if I’m not convinced, I’ll ask for references or a demo. Sophie couldn’t provide any references, btw :stuck_out_tongue:

In the end, that’s your challenge, as it always is: convincing your potential customers that you bring dependable value, without hurting them along the way (or at least not too badly). This, at least, is something that hasn’t changed (yet).

It might help to prepare a breakdown, with some visualizations, to hand out before a meeting - maybe make them available on your site. I have obviously checked it out, and there’s room for improvement. Feel free to DM me if you want honest feedback on it; I’m not going to do that here in the open.


I experience the same: fear, distrust, and even laughter.

The problem is that most people don’t have enough information about the topic and feel threatened. I have had good experiences with companies that don’t have much customer-support staff and need to reassign those people to other tasks.

In my opinion it’s still in an early phase, but the time will come - just keep on track…


Sorry if this statement sounds circular or like I’m pointing out the obvious, but I really do think it’s this simple: until it becomes the norm for companies to rely on AI tools as main components of their business structure, this will be a subject of great skepticism, fear, and lack of awareness and understanding for the majority. And whether it becomes the norm depends on a wide variety of influences, most of which are beyond the average individual’s ability to significantly affect. It’s just like other paradigm shifts that societies go through: being an early adopter of something new means unavoidably running into misunderstanding, fear, overwhelm, and avoidance. People tend not to like being told that they need, should want, or should learn something different and new that they don’t understand. We humans are usually much more receptive to things we can already relate to in some way.

So, if I wanted to communicate about AI tech and encourage someone to learn more and consider adopting it, I’d first do my best to understand their perspective before giving them my thoughts or suggestions.

I’ve found time and time again that most people I’m around IRL have little to no interest in many of the subjects I care about, and the only times they’re receptive to me talking about something like AI is when it comes up in relation to something they’re interested in or already aware of. Making connections to more familiar things is a very helpful skill for getting people more interested in learning.


This sounds like an opportunity to practice what the best educators do: finding simpler ways to explain complicated subjects. Sort of the ELI5 approach, but tailored to each specific setting.


There was a time when people would not eat potatoes because they believed they were toxic.