A customer tries out the Apple iPhone 5C smartphone at the Apple Store on the first day of sales in Berlin, Sept. 20, 2013. Sean Gallup/Getty Images

Apple Inc. CEO Tim Cook has pushed back against a court order that would force the company to develop backdoor access into an encrypted iPhone, arguing that creating a security bypass for one iOS device would effectively create one for all iOS devices. But despite Cook’s resistance to the order, cybersecurity experts say it may be technically feasible for the company to comply with the request without exposing newer iOS devices.

The smartphone in question is an iPhone 5C that belonged to Syed Rizwan Farook, one of the killers in the San Bernardino shootings. The order, issued by the U.S. District Court for the Central District of California, compels the iPhone maker to assist the FBI in bypassing the passcode lock on the smartphone.

Here’s an excerpt of what the court has ordered Apple to do:

“Apple’s reasonable technical assistance shall accomplish the following three important functions:

(1) it will bypass or disable the auto-erase function whether or not it has been enabled;

(2) it will enable the FBI to submit passcodes to the SUBJECT DEVICE for testing electronically via the physical port, Bluetooth, Wi-Fi, or other protocol available on the SUBJECT DEVICE; and

(3) it will ensure that when the FBI submits passcodes to the SUBJECT DEVICE, software running on the device will not purposefully introduce any additional delay between passcode attempts beyond what is incurred by Apple hardware.”

Put more simply, the court wants Apple to disable the feature that erases the iPhone after 10 incorrect passcode attempts, while allowing the FBI to connect an external device and try an unlimited number of passcodes without any time restrictions or delays. To fulfill the order, the court proposes that Apple develop custom bypass software hardcoded to work only with the iPhone involved in the investigation.

Apple may not be willing to comply with the court order, but it may be feasible for the company to create the backdoor demanded by the FBI, according to a blog post by Dan Guido, CEO of security startup Trail of Bits. The reason is that the iPhone 5C at the center of the investigation lacks some of the built-in hardware security features found in newer iPhones, such as the Secure Enclave in the iPhone 5S, 6 and 6S, which enforces escalating delays between repeated passcode attempts. That same piece of hardware also protects sensitive information such as the credit card data used by the Apple Pay mobile payment system.

Since the iPhone 5C, released in 2013, lacks the security features of newer devices, in theory it would take only a software modification to bypass some of the passcode protections. That said, the iPhone 5C does have a hardware limit of one passcode attempt every 80 milliseconds.

Without any additional roadblocks in the way, modified software could allow the FBI to access an iPhone 5C locked with a four-digit passcode in under an hour, according to Matthew Green, assistant professor of computer science at Johns Hopkins University. A six-digit passcode would lengthen the process by a factor of up to 100. It all depends on how complex a passcode was used with the device.
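Those figures follow directly from the 80-millisecond hardware limit mentioned above. As a rough sanity check (my arithmetic, not Green's), here is the worst-case math, assuming purely numeric passcodes and no software-imposed delays:

```python
# Back-of-the-envelope brute-force timing for a numeric iPhone passcode,
# assuming the article's figure of one attempt every 80 ms (hardware floor)
# and no additional software delays or auto-erase.

ATTEMPT_SECONDS = 0.080  # 80 ms per passcode attempt

def worst_case_seconds(digits: int) -> float:
    """Time to exhaust every numeric passcode of the given length."""
    keyspace = 10 ** digits  # e.g. 10,000 four-digit codes
    return keyspace * ATTEMPT_SECONDS

print(f"4 digits: {worst_case_seconds(4) / 60:.1f} minutes")   # ~13.3 minutes
print(f"6 digits: {worst_case_seconds(6) / 3600:.1f} hours")   # ~22.2 hours
print(f"ratio: {worst_case_seconds(6) / worst_case_seconds(4):.0f}x")  # 100x
```

A four-digit passcode falls in roughly 13 minutes at worst, comfortably "under an hour," while six digits multiplies the keyspace, and thus the time, by exactly 100.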

While this bypass may be feasible for the iPhone 5C, the same software bypass wouldn’t be able to get around the passcode time delay enforced by a Secure Enclave chip in the iPhone 5S, 6 and 6S. That said, if Apple were forced to develop a backdoor, it could set a precedent that could later extend to the rest of its iOS product line.

“It would potentially — taken to the logical extreme — mean that Apple would be forced to modify the software in new devices that are being sold, so that they no longer had all those security features,” said Green. “I think that’s a really crazy reading, but you start to go in that direction once you have an order like this.”