Engineer.ai says its “human-assisted AI” allows anyone to create a mobile app by clicking through a menu on its website. Users can then choose existing apps similar to their idea, such as Uber’s or Facebook’s. Then Engineer.ai creates the app largely automatically, it says, making the process cheaper and quicker than conventional app development.
“We’ve built software and an AI called Natasha that allows anyone to build custom software like ordering pizza,” Engineer.ai founder Sachin Dev Duggal said in an onstage interview in India last year. Since much of the code underpinning popular apps is similar, the company’s “human-assisted AI” can help assemble new ones automatically, he said. Roughly 82% of an app the company had recently developed “was built autonomously, in the first hour” by Engineer.ai’s technology, Mr. Duggal said at the time.
Documents reviewed by The Wall Street Journal and several people familiar with the company’s operations, including current and former staff, suggest Engineer.ai doesn’t use AI to assemble code for apps as it claims. They indicated that the company relies on human engineers in India and elsewhere to do most of that work, and that its AI claims are inflated even in light of the fake-it-till-you-make-it mentality common among tech startups.
Note the article is from 2019. "Engineer.ai" is the same company as "Builder.ai".
To summarise, my reading of the article is that the founder of Builder.ai (at the time "Engineer.ai") promoted the company's technology as mostly AI, assisted by a few humans; and that WSJ saw documents suggesting otherwise.
dang, why do you say the claim is "lurid"? Is it because of the racist undertones of "Indians not AI"? That's fair, there's severe racism against Indian coders in the West, but scams and fraud absolutely happen and it is inevitable to be disgusted when they are revealed. There has to be a more balanced stance than dismissing all fraud claims as "lurid".
I called it lurid because it's sensational and yes, because of the implicit slur.
(Not the most precise use of the word lurid because it lacks the ghastly/uncanny quality - https://www.etymonline.com/search?q=lurid, but I couldn't think of a better one.)
Yash Mittal, ex-Associate Director of Product @ Builder.ai
He describes a sophisticated AI-automated development pipeline, including:
- Requirement gathering via conversational AI
- Auto-generated features, user stories and test cases
- Graph neural nets building prototypes
- AI-based design and code generation
- Quality checks handled by humans, which he says was not deception
- Human developers who, he maintains, never pretended to be AI
---
Please, let's keep the information here reliable. We don't want this forum turning into another Reddit. Thanks to everyone who took the time to investigate and share evidence or real experiences.
Here we go again. We're amplifying accusations from pseudo blogs with little credibility, and spreading rumours in a forum where most of us are AI/ML engineers, researchers, founders, and university professors. We should know better.
I read their site and blog, and they have a lot of screenshots of their internal apps. It can't be fake! One of them shows templates of well known sites. Based on what I read, you choose a template, Natasha or something else handles the assembly, which I assume is just a fancy way of saying it checks out a repo and installs dependencies. Then the Indian programmers do the rest. This is clearly explained on their website.
Guys, take a look at their blog or website. There's plenty of information about their apps, which were reviewed by Microsoft before they invested 250 million.
And here's the project progress dashboard showing how long it takes to build. In this case, it's 7 months. Clearly, there’s no GenAI involved if it takes that long.
Indian programmers and mathematicians are incredibly talented. In fact, an Indian programmer invented the first Transformer, which led to the rise of GenAI. The world chess champion is also from India. So let's stop mocking them. Companies like Google, Microsoft, Apple, Infosys and BuilderAI employ thousands of Indian programmers who are in the top one percent.
The founder of BuilderAI is also from India and was named Entrepreneur of the Year by Ernst & Young in 2024. He hired directors from Microsoft and Amazon. The Head of AI was a former AI Director at Amazon.
You need to stick to the facts. Their website looks legitimate and makes no mention of GenAI.
The founder was named UK Entrepreneur of the Year in 2024 by Ernst & Young, a company that also has a strong focus on AI research (some of my students work there). They do proper research before giving out an award.
Going back to your question:
You're quoting articles from 2019. Do you realise how quickly technology and startups evolve in a single year, let alone 6?
We can't assume nothing has changed since then, especially when there's plenty of evidence that the company grew 10x after the 2019 article was published.
Check on LinkedIn (if you have premium), the company went from 200 employees to 1000, and brought in directors from Microsoft and Amazon, people who are well known in the AI/ML space for their contributions to AI and virtual assistant development.
The biggest mistake the company made, without question, was the lack of transparency around sales figures. That's on the CEO and CFO, who committed fraud.
But now it feels like we're shifting the blame toward the AI engineers, the people who worked hard to build the internal tools the company was promoting. From what I've seen, those engineers built some great apps. If the CFO and the accountants were cooking the books, that's not their fault.
We should support the engineers and hold the founders and accountants responsible, instead of letting a blogger spread misinformation and claim that former directors from Amazon and Microsoft knew nothing about AI and faked the tech, which is clearly fake news.
>> You're quoting articles from 2019. Do you realise how quickly technology and startups evolve in a single year, let alone 6?
So that I don't misunderstand what you mean, do you mean that Builder.ai evolved into using less AI than what they were using in 2019?
That is, do you mean that in 2019 roughly 82% of an app the company had recently developed “was built autonomously, in the first hour” by Engineer.ai’s technology, according to the quote above, and during the next 6 years the company evolved so that less of their apps' code was built by AI?
Why would you come to that conclusion? It's really not that hard to understand.
They already had a library of templates, modules, components, and existing code, which they were likely reusing.
According to their website, their virtual assistant interacts with customers to understand requirements, recommends suitable templates, and assembles a basic version of the app automatically. It then assigns developers to that project, as shown on their site. In one of their blog posts, they mentioned using machine learning to generate dependency graphs, which were used to map out what needed to be built and estimate timelines.
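For what it's worth, a dependency graph of features can drive a timeline estimate without any GenAI at all. A minimal sketch of the idea (all feature names and day counts here are hypothetical, not anything from Builder.ai):

```python
def critical_path_days(durations, deps):
    """Longest path through a DAG of features = minimum calendar time,
    assuming a feature can't start until its prerequisites finish and
    independent features are built in parallel."""
    memo = {}

    def finish(feature):
        # Finish time = latest prerequisite finish time + own duration.
        if feature not in memo:
            start = max((finish(d) for d in deps.get(feature, [])), default=0)
            memo[feature] = start + durations[feature]
        return memo[feature]

    return max(finish(f) for f in durations)

# Hypothetical feature graph: payments and the feed both depend on auth.
durations = {"auth": 5, "profiles": 8, "feed": 12, "payments": 10}
deps = {"profiles": ["auth"], "feed": ["profiles"], "payments": ["auth"]}
print(critical_path_days(durations, deps))  # 25 (auth -> profiles -> feed)
```

The point being: "ML to generate dependency graphs" plus scheduling heuristics like this was a perfectly plausible 2019-era pipeline, with no code generation involved.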
Since 2019, they've expanded their AI and ML team and hired the former Director of AI from Amazon.
The issue here is that you're quoting articles published before GenAI or ChatGPT even existed. Back then, AI mostly referred to machine learning, it was a different landscape entirely.
I'm only interested in understanding why they lied about their finances. Because the tech they had was actually quite impressive.
We'll never know if they were reusing 40, 60, or 80 percent of the code. What we do know is that developers were spending 7 months (based on a screenshot from their website) writing code. So GenAI clearly wasn't used to generate the code. But to be fair, they never claimed it was.
So you mean that Builder.ai is using more AI now than they used in 2019? In other words, more than "roughly 82%" of apps the company now develops is built "autonomously, in the first hour by Engineer.ai’s technology"?
Note that generative AI and LLMs like GPT, ELMo and BERT existed earlier than 2019 and were the subject of much research (see "BERTology").
I'm not sure what you're referring to. I don't think you understand the difference between automation, traditional AI, and generative AI.
Back in 2019, when someone said part of an app was built "autonomously", it usually meant they reused components and generated the glue logic, configurations, or some custom code around it. In BuilderAI's case, they said they were using AI/ML to create dependency graphs. Just read their blog, it's all there!
The Transformer paper came out in 2017, and the first time I heard of GPT was in 2019. At that point, only a few companies like Google and OpenAI were working on LLMs. If you're expecting BuilderAI, a small startup from India with limited funding, to compete with multibillion-dollar companies in 2019, then you're being delusional.
I don’t think I can be of any further help. Apologies.
Original link (by nikcub):
https://www.wsj.com/articles/ai-startup-boom-raises-question...
Archive:
https://archive.ph/R3nMZ