When American and British spymasters wanted to dip into information about security “suspects” – and anyone else – they simply tapped mobile networks, scooping up data that apps transmit about users to marketing companies. According to the latest blockbuster revelations by Edward Snowden, the NSA and Britain’s GCHQ monitored “leaky” apps like Angry Birds to gather personalized data on millions of people.
Nobody would accept a leaky pail or a bottle that dribbles, but for some reason, according to Erez Metula, CEO of AppSec Labs, we’re willing to put up with poorly coded apps that allow hackers – government or otherwise – to get hold of sensitive data, using backdoors to invade apps and raid the information we have on devices.
“People who write code have traditionally left security to someone else, but increasingly we are beginning to understand that you need to have security built into the lowest level of an app, in the programming code,” said Metula, and his company has the tools and training to enable developers to do just that.
If they hadn’t understood it before, the new revelations about data leaks have driven home the point about the need for code-level security. According to reports, America’s National Security Agency (NSA) and Britain’s GCHQ have been monitoring Angry Birds and other apps to gather data on users. Information collected ranged from the phone model and operating system on a device, to the age, location, and other personal details of the user.
Rovio, the company that makes Angry Birds, vehemently denied having anything to do with the surveillance, suggesting that the NSA got the information from advertising networks which Rovio uses to monetize the app. While the ad networks don’t have access to information like names and addresses, they do track browsing history, location, and, very often, a phone’s unique ID. Spymasters, as well as hackers, can exploit programming holes in apps (Angry Birds is far from the only app to be monitored in this way, according to experts), siphoning the transmitted information to their own servers. With the phone’s ID, it would be easy for the government to build a complete usage profile for individuals.
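To see why a stable device ID is so valuable to an eavesdropper, consider a minimal sketch of what an interceptor could do with a batch of captured ad-network requests. The request fields and values here are purely illustrative assumptions, not drawn from any real ad network:

```python
# Hypothetical sketch: aggregating intercepted ad-network requests,
# each tagged with a device's unique ID, into a per-device profile.
# All field names and values are invented for illustration.
from collections import defaultdict

def build_profiles(intercepted_requests):
    """Group leaked request fields by the device ID they carry."""
    profiles = defaultdict(lambda: {"apps": set(), "locations": set()})
    for req in intercepted_requests:
        profile = profiles[req["device_id"]]
        profile["apps"].add(req["app"])
        profile["locations"].add(req["location"])
    return profiles

requests = [
    {"device_id": "abc-123", "app": "angrybirds", "location": "Helsinki"},
    {"device_id": "abc-123", "app": "weather",    "location": "Espoo"},
]
profiles = build_profiles(requests)
```

Because the same ID appears in every request the phone sends, individually harmless fields accumulate into one profile the moment they can be joined on it.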
But there are ways to prevent data leaks from apps, Metula told The Times of Israel. “We offer services, like penetration testing, that help developers ‘tighten up’ their app security, but the most important service we offer is training programmers in how to ‘bake’ security into their apps. If they can plug up the potential holes in the code, the chances that hackers will be able to exploit apps for data will be greatly lowered.” In the case of transmitted data, for example, that might entail encrypting the information sent to ad networks, or installing anti-hack software to turn off the flow of data when a rogue agent is detected.
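One way to “bake in” that kind of protection, sketched below under the assumption of a hypothetical analytics payload (the field names and the per-install salt are invented), is to strip direct identifiers and replace the raw device ID with a salted hash before anything leaves the phone, with the result then sent only over TLS:

```python
# Hypothetical sketch of data hygiene applied before an app reports
# analytics: drop direct identifiers and hash the device ID, so an
# eavesdropper who captures the traffic cannot link records back to
# the hardware identifier. Field names are illustrative assumptions.
import hashlib

APP_SALT = b"per-install-random-salt"  # assumption: random, generated once per install

def sanitize_event(event):
    """Return a copy of an analytics event that is safer to transmit."""
    safe = {k: v for k, v in event.items()
            if k not in ("name", "address", "phone_number")}
    # A salted hash still lets the ad network deduplicate events,
    # but no longer exposes the real device identifier on the wire.
    safe["device_id"] = hashlib.sha256(
        APP_SALT + event["device_id"].encode()).hexdigest()
    return safe

event = {"device_id": "IMEI-356938035643809", "name": "Alice", "app": "game"}
out = sanitize_event(event)
```

This is only one layer; in practice the sanitized payload would also be sent over an encrypted channel rather than plaintext HTTP.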
Defending an app against exploits is similar to defending a server against them; testers look for backdoors and other exploitable weaknesses, and for vulnerable code that opens data up to “drive-by” trojan horses or viruses that collect information. For various reasons, said Metula, security had not been on the radar of programmers. “It was always thought of as a luxury, a burden that was off-loaded to the security department, which added on a component to defend the data.” But as hackers have taken advantage of weak coding, it’s become clearer that apps need “deeper” security and “not a bandage to be applied to an app after it was already built.”
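A tiny example of the kind of automated check such a tester might run, assuming a hypothetical source snippet, is a scan for plaintext-HTTP endpoints in app code, one common way user data ends up readable to anyone on the network path:

```python
# Minimal sketch of a static check a tester might run over app source:
# flag any non-TLS (http://) endpoints, which send data in the clear.
# The scanned snippet below is a made-up example.
import re

PLAINTEXT_URL = re.compile(r"http://[^\s\"']+")

def find_plaintext_endpoints(source_code):
    """Return any plaintext http:// URLs found in a source snippet."""
    return PLAINTEXT_URL.findall(source_code)

snippet = 'requests.post("http://ads.example.com/track", data=payload)'
hits = find_plaintext_endpoints(snippet)  # ['http://ads.example.com/track']
```

Real mobile security scanners go much further (checking certificate validation, storage permissions, and so on), but the principle is the same: find the weak spot in the code before an attacker does.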
There are a lot of apps and systems that need to be tightened up, and Metula and his team can’t handle them all – which is why the company’s main business is training others to teach security coding skills. “We train trainers, creating the content and teaching teams how to pass that content on,” said Metula. Besides ensuring that developers will learn better coding skills, by building a wide-ranging distribution system the company is setting its methodology as a standard platform for app testing, which hasn’t existed until now.
“With the proliferation of security requirements for apps, some of them legislated, such as electronic health apps in the US (regulated by HIPAA, the Health Insurance Portability and Accountability Act), there is a need for a standardized testing and compliance system for app security,” said Metula. “We are hopeful that our platform will become the standard one that testers and trainers use in a wide variety of scenarios.”
AppSec Labs is off to a good start in that department. Among its customers is HP, which is using the company’s platform and materials to train its own programmers in security worldwide. “We know many other companies, like Microsoft, Google, Intel, and others have the same needs, and we are approaching them as well,” said Metula, pointing to the very positive feedback AppSec Labs has gotten from HP as a selling point for new contracts.
Metula got interested in the nexus between programming and security when he was 12, hacking into computer games to skip himself ahead to more advanced levels. Those skills serve him well in an era when even the most seemingly innocent game could harbor a nest of online spy agents. “I guess the router backdoors found in many network devices, the fact that devices come with built-in hidden surveillance capabilities, and the whole espionage fiasco between the USA and other nations made me start to think that everyone is eavesdropping on everyone these days,” said Metula. “There are no secrets anymore.”