As artificial intelligence (AI) recently became a trending topic due to the capabilities displayed by GPT-4, the latest model powering ChatGPT, a project claiming to be an “AI-based” decentralized application has taken almost $1 million from its users in a suspected scam.

Blockchain security platform CertiK confirmed that Harvest Keeper had stolen around $933,000 of users’ assets at the time of writing. Users have also lost around $219,000 to ice phishing transactions across the Ethereum, BNB Smart Chain and Polygon networks, according to CertiK. The security firm urged users to revoke the token approvals they had granted to the project and warned people to stop interacting with its website.
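In practice, revoking such permissions means resetting the ERC-20 allowance that was granted to the suspect contract back to zero. The sketch below illustrates one way to do this with ethers.js (v6 assumed); the token and spender addresses are placeholders, and the same result can be achieved through tools such as Revoke.cash or Etherscan’s token approval checker.

```ts
import { ethers } from "ethers";

// Minimal ERC-20 ABI fragment needed to inspect and reset an allowance.
const ERC20_ABI = [
  "function approve(address spender, uint256 amount) returns (bool)",
  "function allowance(address owner, address spender) view returns (uint256)",
];

// Placeholder addresses: substitute the token you approved and the
// contract you granted the approval to.
const TOKEN_ADDRESS = "0xTokenAddressHere";
const SPENDER_ADDRESS = "0xSuspectContractHere";

async function revokeApproval(signer: ethers.Signer): Promise<void> {
  const token = new ethers.Contract(TOKEN_ADDRESS, ERC20_ABI, signer);

  // Check the current allowance before spending gas on a revocation.
  const current = await token.allowance(await signer.getAddress(), SPENDER_ADDRESS);
  if (current === 0n) return;

  // Setting the allowance to zero removes the spender's ability to move the tokens.
  const tx = await token.approve(SPENDER_ADDRESS, 0n);
  await tx.wait();
}
```

Note that revoking an approval only blocks future transfers; it cannot recover tokens that have already been moved.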

Harvest Keeper claimed to be an AI project that “optimizes the trading process for maximum payout” and advertised a 4.81% return on user deposits. Its website also promised a 101% return on investment within 21 days and an 8% referral reward. The project has almost 30,000 followers on Twitter and more than 32,000 followers on its Telegram channel.
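For context, the two advertised figures line up if the 4.81% is read as a daily simple return, an interpretation on our part rather than a stated detail: 4.81% × 21 days ≈ 101%, matching the promised 21-day payout.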

Cointelegraph reached out to Harvest Keeper for comment but did not receive a response.

Related: BingChatGPT ‘pump and dump’ tokens emerging by the dozen: PeckShield

Meanwhile, as the ChatGPT hype resurfaced on Twitter, dozens of accounts claiming to be related to “CryptoGPT” have emerged on the social platform. On March 10, a hashtag tied to a token project called “CryptoGPT” began trending on Twitter, and a number of similarly named accounts appeared alongside it, some offering giveaways and airdrops that are suspected to be fake.

As the newest version of ChatGPT showed that it could audit smart contracts on Ethereum, many speculated on whether it could eventually replace developers. However, at the recent ETHDubai event, blockchain developers expressed confidence that the new iteration of the popular AI tool will not replace them but will instead assist them.