Neon App: Get Paid to Record Calls That Are Sold to AI Companies

A Controversial App and the Rise of AI Data Collection
Neon Mobile, an app that pays users to record their phone calls and sells the audio to artificial intelligence companies, is currently the No. 2 app in the Social Networking category of Apple’s U.S. App Store.
How Neon Mobile Operates
Neon Mobile is marketed as a revenue-generating opportunity, promising users “hundreds or even thousands of dollars per year” for providing access to their audio conversations.
The company’s website details a payment structure of 30¢ for each minute of calls made to other Neon users, and a maximum daily earning potential of $30 for calls to any contact. Referral bonuses are also offered.
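To make the advertised numbers concrete, here is a minimal sketch of the payout math in Python. It assumes the 30¢-per-minute rate applies to calls with other Neon users and that the $30 figure acts as a cap on a day’s total call earnings; Neon does not publish its actual payout logic, so the function and its behavior are illustrative only.

```python
# Hypothetical illustration of Neon's advertised payout figures.
# The rate and the cap behavior are assumptions drawn from the
# numbers quoted on the company's website, not a published formula.

NEON_TO_NEON_RATE = 0.30   # dollars per minute for calls to other Neon users
DAILY_CAP = 30.00          # advertised maximum daily earnings from calls

def estimate_daily_earnings(neon_call_minutes: float) -> float:
    """Estimate one day's call earnings, capped at the advertised maximum."""
    uncapped = neon_call_minutes * NEON_TO_NEON_RATE
    return min(uncapped, DAILY_CAP)

if __name__ == "__main__":
    # 120 minutes of Neon-to-Neon calls would exceed the cap (120 * 0.30 = $36),
    # so the estimate is held at $30.
    print(estimate_daily_earnings(120))   # -> 30.0
    print(estimate_daily_earnings(45))    # -> 13.5
```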
Data from Appfigures reveals a significant surge in the app’s ranking. It moved from No. 476 in the Social Networking category on September 18th to No. 10 by the end of the following day.
By Wednesday, Neon had ascended to the No. 2 position among the top free social applications available on iPhones.
The app also briefly held the No. 7 spot overall in the App Store before climbing to No. 6.
Data Capture and Usage
According to Neon’s terms of service, the application is capable of capturing both incoming and outgoing phone calls. However, marketing materials emphasize that only the user’s side of the conversation is recorded, except when communicating with another Neon user.
This collected data is then sold to “AI companies” to facilitate the development, training, and refinement of machine learning models, artificial intelligence tools, and related technologies, as stated in the terms of service.
The emergence and acceptance of such an app in the app stores underscores AI’s growing influence and its encroachment into areas once considered private.
The app’s high ranking demonstrates a willingness among a segment of the market to trade personal privacy for modest financial gain, potentially overlooking the broader implications for themselves and society.
Terms of Service and Data Licensing
Despite assurances in its privacy policy, Neon’s terms of service grant the company a remarkably broad license over user data.
This expansive licensing agreement provides Neon with considerable flexibility regarding data usage, potentially exceeding the scope of its stated claims.
The terms also include a detailed section concerning beta features, which are offered without any warranty and may contain various issues or defects.
Legality and Expert Opinions
Though it raises numerous concerns, Neon’s operation may be technically legal.
Jennifer Daniels, a partner at Blank Rome’s Privacy, Security & Data Protection Group, explained to TechCrunch that recording only one side of a phone call is a strategy to circumvent wiretap laws.
Daniels noted that many state laws require the consent of all parties involved in a conversation for it to be legally recorded, describing Neon’s strategy as “an interesting approach.”
Peter Jackson, a cybersecurity and privacy attorney at Greenberg Glusker, concurred, suggesting that the phrasing around “one-sided transcripts” could indicate that Neon records entire calls but removes the other party’s contributions from the final transcript.
Data Anonymization Concerns
Legal experts also expressed concerns regarding the true extent of data anonymization.
Neon claims to remove personally identifiable information such as names, emails, and phone numbers before selling data to AI companies. However, the company does not specify how these AI partners or other purchasers might utilize the remaining data.
Voice data, for example, could be employed to create fraudulent calls mimicking a user’s voice, or AI companies could leverage it to develop synthetic voices.
“Once your voice is over there, it can be used for fraud,” Jackson cautioned. “This company now possesses your phone number and sufficient information – including voice recordings – to potentially impersonate you and commit various fraudulent activities.”
Even assuming the company’s integrity, Neon does not disclose the identities of its trusted partners or the permissible uses of user data by those entities. The app is also vulnerable to potential data breaches, a risk inherent to any organization handling valuable data.
App Functionality and Transparency
During a test conducted by TechCrunch, Neon did not provide any indication that calls were being recorded, nor did it issue any warnings to call recipients.
The app functioned similarly to other voice-over-IP applications, displaying the standard inbound phone number on the caller ID. (Security researchers are encouraged to verify the app’s claims further.)
Company Information and Funding
Alex Kiam, the founder of Neon, did not respond to a request for comment.
Kiam, identified only as “Alex” on the company website, operates Neon from a New York apartment, according to business filings.
A LinkedIn post suggests that Kiam secured funding from Upfront Ventures for his startup a few months ago, but the investor had not responded to an inquiry from TechCrunch at the time of reporting.
Has Artificial Intelligence Diminished User Sensitivity to Privacy?
Previously, organizations aiming to monetize data gathered via mobile applications typically engaged in such practices discreetly.
The disclosure in 2019 that Facebook had paid teenagers to install a surveillance app sparked significant controversy, and more headlines followed the next year when app-store analytics firms were found to be operating numerous seemingly innocuous apps to gather usage statistics about the mobile app ecosystem.
Frequent advisories caution against the use of VPN applications, which often do not provide the level of privacy they advertise, and government reports detail how agencies routinely purchase personal data that is openly available for sale.
Currently, AI-powered note-taking agents routinely join meetings, and always-on AI devices are readily available. In those cases, however, explicit consent for recording is generally obtained, as Daniels explained to TechCrunch.
Considering the pervasive nature of personal data usage and its commercialization, a degree of cynicism has likely developed, leading some to believe that if their data is inevitably being sold, they might as well benefit financially from it.
However, this perspective may be shortsighted, as individuals could be inadvertently disclosing more information than intended and potentially compromising the privacy of others in the process.
“A substantial inclination exists, particularly among professionals – and indeed, the general public – to streamline tasks as much as possible,” Jackson observes. “Certain productivity applications achieve this by sacrificing, quite obviously, your own privacy, but increasingly, also the privacy of those you interact with daily.”