Audio-based social app Clubhouse has allegedly suffered a data breach: a third-party developer built an open-source app that let Android smartphone users access the invite-only, iPhone-only service, SiliconANGLE has reported. Clubhouse, which launched in March 2020, quickly gained popularity, raising $100 million in funding in January.
According to SiliconANGLE, a programmer in mainland China designed the app and published its open-source code on GitHub, which Microsoft Corp. has owned since 2018. The developer said the app was designed to let anyone listen to audio on Clubhouse without an invite code, with access to various personal sessions.
John Furrier, founder and chief executive officer of SiliconANGLE Media Inc., who has been digging into Clubhouse and noticed the leaked chats, explained that one of the alleged hacks involves bricking an iPhone, reverse-engineering the Clubhouse application and then using a bot’s “malicious code” to access the various streams and share them. “Then the program calls the Agora backend as it traverses the room IDs,” Furrier says. “If Clubhouse bans the bot, another iPhone takes its place.”
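Furrier’s description, a client that walks through room IDs against the backend, reflects a classic enumeration weakness: if identifiers are guessable and requests are not individually authorized, any client that can resolve one room can resolve them all. A rough, hypothetical sketch of that traversal logic follows; none of these names come from Clubhouse or Agora, and the backend here is just a mock lookup table standing in for an API call.

```python
# Hypothetical illustration of ID enumeration. The "backend" is a mock
# dictionary; in a real attack this would be an unauthorized API call.

def fetch_room(room_id, backend):
    """Stand-in for a backend request; returns the room or None."""
    return backend.get(room_id)

def enumerate_rooms(backend, id_range):
    """Traverse candidate room IDs, collecting every one that resolves."""
    found = {}
    for room_id in id_range:
        room = fetch_room(room_id, backend)
        if room is not None:
            found[room_id] = room
    return found

# Mock backend: three rooms with sequential-ish numeric IDs.
mock_backend = {101: "room-a", 102: "room-b", 105: "room-c"}
rooms = enumerate_rooms(mock_backend, range(100, 110))
```

Typical mitigations for this class of weakness are per-request authorization checks on the server and non-guessable (random, high-entropy) room identifiers.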
This is not the first time security concerns have surrounded Clubhouse. Security leaders have criticized the app for launching without much regard for privacy.
Burak Agca, Engineer at Lookout, a San Francisco, Calif.-based provider of mobile security solutions, says, “Clubhouse wants to bring communities together by enabling individuals to discuss common interests and learn more about new topics. The trouble is that the audio infrastructure is built on a China-based platform, which means some of that data is sent back to China.”
Agca explains, “It’s alarming that platforms like this are built on leveraging coarse data transfer practices that users accept when they install these apps. Consumers trust their mobile devices and the apps on them to be inherently secure. This may lead them to open up their devices to unknown communications with data collection and traffic management systems. There were similar issues with TikTok communicating with Chinese IPs in 2020. The parent company, ByteDance, said it didn’t share any user data with the Chinese government. In the case of both TikTok and Clubhouse, we all know that if the Chinese government really wants something, they’ll get it.
“In this case the developers have disabled App Transport Security by default for this app, which means unsecured traffic and weak encryption standards may be used. The network diagram of the analysis of the app clearly shows hardcoded communication with Chinese servers. This falls far outside of data best practices when user data is then being sent to biometric, voice and data analytics companies based in China. IT and security teams need a way to understand the data handling and transfer practices of any app on an employee device. Some app permissions that seem innocuous to the individual end user may be malicious in the corporate sense and violate compliance policies.
“This serves as a reminder to individuals that they shouldn’t share sensitive data over personal channels. We now seamlessly transition between our work and personal lives on a mobile device, increasing the risk of users inadvertently sharing corporate information on social media, even if it’s with another co-worker.
“This incident shows how important it is to have mobile security that can alert you to risky data handling practices. You also need a way to ensure apps don’t introduce malware, in order to effectively protect against data exfiltration,” Agca says.
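For context on Agca’s App Transport Security point: on iOS, ATS is controlled through keys in an app’s Info.plist, and setting `NSAllowsArbitraryLoads` to true disables its protections app-wide, permitting plaintext HTTP and weaker TLS configurations for all domains. A minimal illustration of that mechanism follows; this is the standard Apple configuration key, not a fragment taken from the Clubhouse app itself.

```xml
<!-- Info.plist fragment: disabling App Transport Security app-wide.
     Illustrative only; not taken from the Clubhouse app. -->
<key>NSAppTransportSecurity</key>
<dict>
    <!-- Permits plain-HTTP connections and weaker TLS for all domains. -->
    <key>NSAllowsArbitraryLoads</key>
    <true/>
</dict>
```

Security reviewers commonly flag this key during app analysis, since it removes the platform’s default guarantee that network traffic uses HTTPS with modern TLS.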
*Image courtesy of Agca*
Jeremy Turner, Head of Threat Intelligence at Coalition, says, “The Clubhouse breach puts a spotlight on a common problem for technology startups: the benefits of technology are often the prime focus or motivating factor for both developers and users, which can be shortsighted. When a technology’s value is so significant and adoption so swift, the risks come as an afterthought. Startups should be cautious of moving faster than they can keep up with security and privacy considerations. When developers push new technology into the hands of early adopters, the risks are easy to ignore or think of as a problem for tomorrow, when in reality they should develop data security measures as thoroughly as they develop new user experiences. Early-stage development risks always seem to be over the horizon, until they’re not.”
Caroline Wong, cybersecurity expert and Chief Strategy Officer at Pentest-as-a-Service leader Cobalt, says, “At the end of the day, these types of security flaws often stem from software development life cycles that fail to incorporate rigorous technical security testing. To avoid future scenarios like we’re seeing with Clubhouse, organizations must incorporate the human element into their security testing from the beginning – via threat modeling, manual pentesting, abuse and misuse cases. Organizations should avoid relying purely on automated vulnerability scanners, which cannot detect business logic flaws. Investing in the right security controls at the right time can save organizations and their customers a world of challenges down the road.”