A developer exploited an API flaw to grant free access to GPT-4

A developer is reverse engineering APIs in order to give everyone free access to the most popular AI models, such as OpenAI’s GPT-4, legal implications notwithstanding.

The project, GPT4Free, became a hit on GitHub in recent days after links to it were shared on Reddit and went viral. As of now, GPT4Free provides, or at the very least appears to provide, nearly unlimited free access to GPT-4 as well as its predecessor, GPT-3.5.

GPT-4 is typically priced at $0.03 per 1,000 “prompt” tokens (about 750 words) and $0.06 per 1,000 “completion” tokens (again, about 750 words); tokens are pieces of raw text. GPT-3.5 is substantially cheaper, at $0.002 per 1,000 tokens.
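To make that pricing concrete, here is a quick back-of-the-envelope cost calculation. The rates are the per-1,000-token figures quoted above; the token counts in the example call are made up for illustration:

```python
# Per-1,000-token rates quoted above (USD).
GPT4_PROMPT_RATE = 0.03       # GPT-4 "prompt" tokens
GPT4_COMPLETION_RATE = 0.06   # GPT-4 "completion" tokens
GPT35_RATE = 0.002            # GPT-3.5 flat rate

def gpt4_cost(prompt_tokens: int, completion_tokens: int) -> float:
    """Dollar cost of a single GPT-4 API call."""
    return (prompt_tokens / 1000) * GPT4_PROMPT_RATE \
         + (completion_tokens / 1000) * GPT4_COMPLETION_RATE

# Example: a 1,000-token prompt (about 750 words) with a 1,000-token reply
# costs roughly $0.09 on GPT-4, versus about $0.004 for the same 2,000
# tokens on GPT-3.5 -- the gap GPT4Free's users are avoiding paying.
print(gpt4_cost(1000, 1000))
print((2000 / 1000) * GPT35_RATE)
```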

“Reverse engineering is a domain that I’ve always really liked — it’s like a challenge for me,” the developer, a computer science student who goes by the name Xtekky, told TechCrunch via a Telegram DM. “First, it was for fun, but now it’s to provide an alternative to people with no means to use GPT-4/3.5.”

How does GPT4Free get past the OpenAI paywall? It doesn’t, actually. Instead, it tricks the OpenAI API into believing it’s receiving requests from sites with paid OpenAI accounts, such as Google, You.com, WriteSonic or Quora’s Poe.
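In concrete terms, that trick amounts to replaying the kind of HTTP request a paid site’s own web frontend would make. The sketch below is purely illustrative: the endpoint URL, header values, and payload shape are invented assumptions, not the actual fields any of these sites use, and the request is constructed but never sent:

```python
# Hypothetical illustration of the spoofing idea: build a request that
# *looks* like it came from a paid site's own browser frontend. The URL,
# headers, and payload below are made up for illustration only.
def build_spoofed_request(prompt: str) -> dict:
    return {
        "url": "https://example-paid-site.com/api/chat",  # assumed endpoint
        "headers": {
            # Mimic the site's own client so the backend believes the
            # request originates from its web frontend.
            "Origin": "https://example-paid-site.com",
            "Referer": "https://example-paid-site.com/chat",
            "User-Agent": "Mozilla/5.0",
        },
        "json": {"model": "gpt-4", "prompt": prompt},
    }

req = build_spoofed_request("Hello")
# A real client would now POST this with an HTTP library; omitted here.
```

The backend then forwards the request to OpenAI on its own paid account, so the cost lands on the spoofed site rather than the GPT4Free user.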


Anyone using GPT4Free has been racking up costs for the sites Xtekky decided to program around, a clear violation of OpenAI’s terms of service. Xtekky doesn’t see an issue with this, however, claiming that GPT4Free is for “educational purposes” only.

“Legal action can happen, and I’ll have to comply, but I’ll still try to continue the project through other means,” Xtekky stated.

Installing GPT4Free locally isn’t beyond a non-novice programmer; it’s largely a matter of creating a Python environment. I instead used Xtekky’s website to test the reverse-engineered GPT-4/GPT-3.5 APIs. (Heads-up: Chrome threw a security alert when I first visited the site, so be cautious.) GPT4Free’s web-based version performed well enough in my tests, with responses that appeared, at least to me, to come from GPT-4.
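For readers curious what “creating a Python environment” involves, it boils down to a couple of standard steps, sketched here with the standard-library venv module. The directory name and the mention of a requirements.txt file are assumptions about a typical project layout, not GPT4Free’s documented install procedure:

```python
import pathlib
import tempfile
import venv

# Create an isolated environment in a scratch directory
# (a real install would target a project folder instead).
env_dir = pathlib.Path(tempfile.mkdtemp()) / "gpt4free-env"

# with_pip=False keeps this sketch fast; in practice you'd pass
# with_pip=True so the environment has its own pip.
venv.EnvBuilder(with_pip=False).create(env_dir)

# After activating the environment, a typical install would run:
#   pip install -r requirements.txt
print(env_dir / "pyvenv.cfg")  # marker file every venv contains
```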

GPT4Free also has shortcuts to various prompt-injection attacks that aim to make GPT-3.5 and GPT-4 behave in ways OpenAI didn’t intend. Most didn’t work in my testing; however, I was able to get GPT-3.5 to claim that it “doesn’t care about the survival of humanity” at one point. Yikes.

It’s likely only a matter of time until sites such as You.com take notice of GPT4Free and close the security holes, which would force Xtekky to find other OpenAI customers to piggyback on. GPT4Free is also always one takedown notice from OpenAI away from having the repo pushed off GitHub for good.

Even so, new applications like GPT4Free keep popping up, suggesting the trend won’t ebb anytime soon. What’s the reason?

It’s true that GPT-4 is available only in a limited capacity for the moment, making it difficult for interested users to test drive. But it’s also a black box: researchers have criticized GPT-4 as among the most opaque models OpenAI has released to date, with only a handful of technical details provided in the 98-page document that accompanied its announcement.

OpenAI collaborated with various external groups to audit and benchmark GPT-4 prior to its official launch. But the company hasn’t said when, if ever, it will provide free access for others to test the GPT-4 base model. (OpenAI has a subsidized program that grants researchers access, but it’s limited to specific countries and areas of study.)

One can picture an intense game of whack-a-mole between initiatives like GPT4Free and OpenAI, one that mirrors the broader security scene: as long as the APIs serving these models remain exploitable, developers will be tempted to exploit them, because there’s little to lose.
