The Rush To Create iOS Apps Can Leave Company Data Exposed And Vulnerable


Companies developing internal iOS apps need to ensure those apps don't compromise security.
Photo: 1Password

Many IT departments are under intense pressure to develop and implement a range of mobility initiatives, and those initiatives often span several IT disciplines. There’s the effort to develop internal apps, the need to provide access to new and legacy systems from mobile devices like the iPhone and iPad, the need to manage and support users’ devices as part of BYOD programs, and the need to build customer-facing solutions like mobile-oriented sites and native apps.

With so many pressures hitting IT organizations at the same time, compromises are being made to meet tight deadlines and budgets. According to security expert Jeff Williams, that push to get solutions out the door as quickly as possible can result in software with major security flaws.

In an interview with GovInfoSecurity, Williams focuses the discussion on mobile app development. Many organizations see the productivity gains that mobile apps can offer them, but in the rush to produce those apps, many aren’t getting the rigorous security testing they need.

While mobile management solutions can help secure devices, Williams notes that a company’s internal apps can be easy attack vectors if a device is lost or compromised, and that device management solutions may offer little protection in such cases.

Most mobile apps have a server side and then several different clients, and the clients could be HTML5, iPhone, Android, Blackberry or whatever. Let me try to paint a picture for you. Imagine a sort of bubble that extends from your company’s data center over a whole bunch of networks – maybe some Wi-Fi and across long distance networks and so on – and ends up inside your mobile device. When you’re extending your enterprise and your data out through this bubble, now it’s your job to protect the bubble.

Here are some of the kinds of ways that there’s exposure there. When the attacker steals your phone or gets a malicious app onto your device, you’ve got to ask yourself if they can get inside that bubble somehow. You want to make sure that your data’s protected when it’s on a device; you want to make sure your data is protected when it’s in transmission between your data center and the device; and then you’ve got to make sure that your application itself is hardened. It’s got to be rugged code.
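Williams doesn’t prescribe a specific technique for protecting data in transit, but one common way to harden the connection between an app and the company data center is public-key pinning, where the app refuses to talk to any server that doesn’t present the key it expects. Here’s a minimal Swift sketch of that idea; the `PinnedSessionDelegate` class and the `pinnedKeyHash` value are placeholders for illustration, not anything from the interview.

```swift
import Foundation
import Security
import CryptoKit

// Minimal sketch of public-key pinning in a URLSession delegate.
// The pinned hash below is a placeholder, not a real value.
final class PinnedSessionDelegate: NSObject, URLSessionDelegate {
    // Base64-encoded SHA-256 of the server's public key bytes (placeholder).
    private let pinnedKeyHash = "REPLACE_WITH_YOUR_SERVERS_KEY_HASH="

    func urlSession(_ session: URLSession,
                    didReceive challenge: URLAuthenticationChallenge,
                    completionHandler: @escaping (URLSession.AuthChallengeDisposition,
                                                  URLCredential?) -> Void) {
        guard let trust = challenge.protectionSpace.serverTrust,
              SecTrustEvaluateWithError(trust, nil),            // normal TLS validation first
              let key = SecTrustCopyKey(trust),                  // server's public key (iOS 14+)
              let keyData = SecKeyCopyExternalRepresentation(key, nil) as Data? else {
            completionHandler(.cancelAuthenticationChallenge, nil)
            return
        }

        // Only continue if the server presents the key the app expects.
        let hash = Data(SHA256.hash(data: keyData)).base64EncodedString()
        if hash == pinnedKeyHash {
            completionHandler(.useCredential, URLCredential(trust: trust))
        } else {
            completionHandler(.cancelAuthenticationChallenge, nil)
        }
    }
}

// All of the app's traffic to the company back end would then go through this session.
let session = URLSession(configuration: .ephemeral,
                         delegate: PinnedSessionDelegate(),
                         delegateQueue: nil)
```

Pinning like this means a stolen or spoofed certificate elsewhere in the chain isn’t enough to intercept the traffic inside Williams’ “bubble,” though it does mean the app has to be updated whenever the server’s key changes.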

Williams also notes that some of the security challenges aren’t really new issues.

Unfortunately, we’re seeing many of the same kinds of mistakes that we saw in web application code from a decade back. Many organizations are so busy wrestling with BYOD and MDM. And they’re trying to be the first to market so they’re dealing with whatever business pressures … to get their app into the app store really quickly and they’re not really giving development the resources that they need in order to build secure code for mobile.

Addressing application security concerns, Williams offers some common-sense advice, like storing as little company data as possible within an iPhone or iPad app and encrypting any data that is kept on the device (functionality that Apple provides to developers through various iOS APIs).

Now for your applications, the first thing is that they should really minimize the sensitive data that you’ll allow to be stored on the phone. That’s the data that’s going to cause exposures. The best thing you can do is not put it on the phone. Maybe you can keep it in memory or keep it on the cloud but don’t allow it to get stored on the phone. If you have to store encrypted data, then I think organizations should provide their developers with an encrypted container – some kind of solution that will make sure that all the data that lands on the phone ends up encrypted, so even if the phone gets lost, stolen or compromised, that data is still protected.
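Apple’s Data Protection APIs can handle part of that “encrypted container” job: a file written with the right protection option is encrypted with keys tied to the user’s passcode and stays unreadable while the device is locked. Here’s a minimal Swift sketch, assuming a hypothetical session token the app needs to persist; the function name and file name are placeholders.

```swift
import Foundation

// Minimal sketch: persist a small piece of sensitive data using iOS Data Protection.
// The token and file name are hypothetical, for illustration only.
func saveSessionToken(_ token: String) throws {
    let url = FileManager.default
        .urls(for: .documentDirectory, in: .userDomainMask)[0]
        .appendingPathComponent("session-token.dat")

    // .completeFileProtection asks iOS to encrypt the file with keys derived from
    // the user's passcode, so it stays unreadable while the device is locked.
    try Data(token.utf8).write(to: url, options: [.completeFileProtection])
}
```

That covers only one slice of what Williams describes: a full enterprise container would also deal with the keychain, key management, and remote wipe, which is where dedicated container products come in.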

He also makes a case for hardening and monitoring backend servers that are actually delivering content and data to a company’s mobile apps.

Although Williams doesn’t mention it explicitly in the interview, which is well worth reading in its entirety, it’s important to keep in mind that internal iOS apps don’t go through the inspection and approval process that Apple applies to apps sold through the iOS App Store. That means that if developers make mistakes or leave vulnerabilities in their code, and the apps aren’t put through a thorough security vetting process, a company may not realize there’s a risk until it’s too late.

Source: GovInfoSecurity
