A customer tries out the Apple iPhone 5C smartphone at the Berlin Apple Store, Sept. 20, 2013. Sean Gallup/Getty Images

To hear Apple tell it, the creation of a “backdoor” to the iPhone owned by San Bernardino shooter Syed Rizwan Farook, propped open however briefly, would imperil the very concept of privacy, not to mention millions of iPhones around the globe.

As a principle and a moral argument, this view has been embraced by Silicon Valley and much of the privacy community. And yet on the separate technical question of whether Apple could safely break into this one phone without unleashing havoc on the privacy world, there is consensus: yes, Apple can, even if it won't.

The court order is actually quite narrow: it asks Apple to build a piece of software, tied to the device's unique ID, that disables the iOS auto-erase function. That would allow the FBI to make unlimited attempts at guessing the passcode (what's known as a "brute force" attack) without any software roadblocks slowing it down.

This custom piece of code would work only on the iPhone 5C belonging to Farook, one of the killers in the Dec. 2 shooting rampage in San Bernardino, California. And it would happen in an Apple facility. In theory, Apple could destroy the code when it's done.
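To put the request in concrete terms, here is a back-of-the-envelope sketch in Python. The roughly 80 milliseconds per attempt is the figure Apple's own iOS security documentation gives for the hardware-entangled key derivation, a cost no software update can remove; the passcode lengths and the arithmetic are illustrative assumptions, not details from the court order.

```python
# Rough arithmetic, not Apple's code: why removing the auto-erase limit
# and the escalating delays between guesses matters. The ~80 ms floor per
# attempt comes from the hardware-entangled key derivation described in
# Apple's iOS security documentation; software cannot reduce it.

PER_TRY_SECONDS = 0.08  # approximate hardware key-derivation cost per guess

def worst_case_hours(passcode_length: int) -> float:
    """Worst-case time to try every all-numeric passcode of a given length."""
    attempts = 10 ** passcode_length
    return attempts * PER_TRY_SECONDS / 3600

for length in (4, 6):
    print(f"{length}-digit passcode: at most {worst_case_hours(length):.1f} hours")

# Prints roughly:
#   4-digit passcode: at most 0.2 hours  (about 13 minutes)
#   6-digit passcode: at most 22.2 hours
```

With the software roadblocks gone, in other words, a short numeric passcode falls in minutes to hours; the only remaining brake is the hardware itself.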

"I don't think there's any significant risk if they do it at Cupertino and keep control [software],” said Sean Sullivan, security expert with F-Secure. “I wouldn't call it a backdoor. There's a door, Apple can make a key that opens the door, prop it open, and then get rid of the key. It's a ‘creative’ use of a service entrance."

On a technical level, there's not much stopping Apple from performing this service for the FBI. The iPhone 5C predates Apple's Secure Enclave, so it lacks any security hardware that would get in the way of a software modification by Apple, according to a detailed blog post by Dan Guido, CEO of security startup Trail of Bits.

Bruce Schneier, the renowned cryptographer and author of a number of recent reports on existing backdoor technology, explained on his blog that not only is the FBI’s request technically feasible, but “Apple assisted in the wording so that the case could be about the legal issues and not the technical ones.”

Recent history suggests Apple chose this moment for its legal significance, not for any technical hurdle in complying safely with the FBI. Apple has complied with similar requests to unlock iPhones for law enforcement about 70 times since 2008, assistant U.S. attorney Saritha Komatireddy argued in 2015, according to court filings. That was all before the 2014 release of iOS 8, which shifted the encryption keys from Apple's servers entirely over to users' devices.

As if in anticipation of a circumstance like this, Apple updated its guidelines accordingly: “For all devices running iOS 8.0 and later versions, Apple will no longer be performing iOS data extractions as the data sought will be encrypted and Apple will not possess the encryption key.”

Apple argues the very existence of this backdoor, or an established pathway into an encrypted iPhone, would be dangerous, especially if it left the building. “The key to an encrypted system ... is only as secure as the protections around it,” Apple CEO Tim Cook wrote. “Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.”

But for a company always in the public spotlight, Apple has a strong track record of keeping secrets; most leaks about the company and its future products come from suppliers. That's not to say it will stay that way. As the transition from Steve Jobs to Tim Cook as CEO showed, things can change, even at Apple.

“Who is Apple tomorrow?” Apple co-founder Steve Wozniak told CNBC. “Who are the people involved? Who are their insiders that would be a little secretive, you know? It’s just like an insider can put a whole bunch of data on a flash key and walk out of a company.”

But even in the unlikely event that this code got loose, Apple still has a defense. Every iOS device must contact Apple's servers for authorization each time it installs a software update. Apple uses this check to dictate which versions of iOS can run on a device, and it could equally use it to refuse to install unauthorized software. This is partly why the majority of iPhones and iPads run the latest version of iOS.
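Code signing is the mechanism behind that defense. The sketch below is a generic illustration in Python using the third-party cryptography package, not Apple's actual update pipeline; the key, the placeholder update image, and the function name are invented for the example. The point is the asymmetry: a device ships with only the vendor's public key, so it can check who signed an update but can never mint a valid signature itself.

```python
# Illustrative sketch only, not Apple's update protocol. A device that
# ships with the vendor's public key refuses any update image whose
# signature doesn't verify. Requires: pip install cryptography
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Stand-in for the vendor's signing key; in reality only Apple would hold it.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()  # the half baked into every device

update_image = b"placeholder firmware image"
signature = private_key.sign(update_image, padding.PKCS1v15(), hashes.SHA256())

def update_is_authorized(image: bytes, sig: bytes) -> bool:
    """Install only if the vendor's signature over the image verifies."""
    try:
        public_key.verify(sig, image, padding.PKCS1v15(), hashes.SHA256())
        return True
    except InvalidSignature:
        return False

print(update_is_authorized(update_image, signature))       # True
print(update_is_authorized(b"tampered image", signature))  # False
```

Under that model, even a leaked copy of the custom unlocking software would be inert on any phone Apple declined to sign it for.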


Could someone bypass that certificate, the unique code that verifies the software is legitimate, and install code without Apple’s permission? “It'd be difficult, but it wouldn't be impossible," said Michaela Menting, digital security research director at ABI Research. "First of all, if you root a device, you can do anything to it. You can write a fake certificate, spoof it and try to fool the phone into believing it's the right one.”

With the right resources, attackers can outright steal or gain access to signing certificates. It's a rare scenario, but hardly unheard of. Microsoft encountered it in 2012, when it discovered that pieces of malware had been signed using one of its certificates. RSA suffered a similar breach in 2011, which hackers used to compromise its SecurID tokens, the physical devices that generate one-time codes for customers, according to the Wall Street Journal.

Technologists looking at the case agree: There is no technical barrier to complying with the FBI that Cupertino couldn't surmount. Rather, it's the prospect that Apple could be repeatedly compelled to do the same in other, more frivolous cases in the U.S., or to comply with the requests of more authoritarian regimes abroad, that worries the company.

“If [Apple] can compromise user privacy for one device, what’s stopping a similar backdoor from being provided in other instances?” said David Gorodyansky, CEO of virtual private network company AnchorFree. “And if it’s provided for the U.S., what’s stopping it from providing it for other governments?”

There’s an argument to be made that this dispute isn’t about the data on this particular phone at all. For Apple, it’s about the right not to hack its own products; for the government, it’s about the right to access encrypted devices with the assistance of their manufacturers.

“They don't even really care about the data on this particular phone (as evidenced by the facts that this is the suspect's work phone — he destroyed his personal phones — and that they're conducting this litigation in public rather than under seal). They chose this particular set of facts to create a precedent,” Nate Cardozo, staff attorney for Electronic Frontier Foundation, wrote via e-mail.

With Apple increasingly looking to markets overseas to drive growth, it’s also in the position of trying to shore up the confidence of its customers in foreign markets.

“Technology companies need to rebuild their credibility with the international market, so certainly Apple wants to fight this publicly,” said Jim Lewis, director at the Center for Strategic and International Studies. “But I don’t think it matters what the U.S. market thinks — Apple has to reassure global customers they’re not giving U.S. agencies unlimited access to customer information.”