Smartphones are not safe

Not in a single respect. Android, especially in its latest versions, has so many privacy-intruding APIs that sandboxing becomes irrelevant. Through access to those APIs, third-party apps become virtually system apps. And I'm not even mentioning the presence of Google apps, which are legalized spyware/malware.

Linux is still the most secure OS.

And? What should we all do in this case? Any solutions?

Don’t become a target of State investigations.

We need a smartphone with a separate encryption layer on top of the encryption based on the security chip's key: a user password. That way, even if the security chip is hacked, it will be impossible to decrypt the data, because the password is not stored anywhere. And of course, the phone should enter an encrypted state not only after a reboot but also after the screen is turned off. It is also desirable to encrypt the entire system rather than just the user data, leaving out only the password entry screen and the processes responsible for decryption, in order to reduce the attack surface.
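The layered scheme described above can be sketched in a few lines of Python. This is a toy model under stated assumptions: the XOR "wrap" stands in for a real key-wrapping cipher like AES-KW, and `chip_key` stands in for a key that a real device would keep locked inside the security chip. The point it illustrates is that the data key is recoverable only with *both* the password and the chip key.

```python
import hashlib
import os

def wrap_dek(password: bytes, chip_key: bytes, dek: bytes) -> tuple[bytes, bytes]:
    """Wrap a 32-byte data-encryption key (DEK) under BOTH the user password
    and the security-chip key, so neither alone can recover it."""
    salt = os.urandom(16)
    # Password-derived key; scrypt is memory-hard, making offline guessing costly.
    pwd_key = hashlib.scrypt(password, salt=salt, n=2**14, r=8, p=1, dklen=32)
    # Combine with the chip key; losing either input loses the DEK.
    kek = hashlib.sha256(pwd_key + chip_key).digest()
    # Toy wrap via XOR; a real implementation would use AES key wrapping.
    wrapped = bytes(a ^ b for a, b in zip(dek, kek))
    return salt, wrapped

def unwrap_dek(password: bytes, chip_key: bytes, salt: bytes, wrapped: bytes) -> bytes:
    pwd_key = hashlib.scrypt(password, salt=salt, n=2**14, r=8, p=1, dklen=32)
    kek = hashlib.sha256(pwd_key + chip_key).digest()
    return bytes(a ^ b for a, b in zip(wrapped, kek))
```

Because the password is never stored, extracting the chip key alone yields nothing useful: the attacker still faces a memory-hard password search.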

Imagine you were just walking down the street, an accident happened near you, and you are a suspect. Or you were just walking in the woods, and passers-by found it strange and suspected you of distributing illegal substances. In both cases, your phone may be seized, and if you live in an authoritarian country, even viewing posts against the government can lead to imprisonment. Data protection is necessary. These are not empty words. Everyone has the right to the privacy of personal correspondence.

What about GrapheneOS on Pixel?


Where the author is correct

1. There is no absolute smartphone invulnerability

Yes. Any mass-market device will eventually get exploits: through the SoC, Secure Enclave / TEE, USB stack, baseband, DMA, etc. Cellebrite, GrayKey, and similar tools really do work, especially in the AFU (After First Unlock) state.

:check_mark: This is undisputed.

2. AFU is the most vulnerable stage

Also true.
Once a phone has been unlocked at least once after boot, some keys are active, services are running, and the attack surface is huge.

:check_mark: This is exactly why GrapheneOS cuts USB access, reduces the attack surface, introduces auto-reboot, etc.

3. File-Based Encryption is a convenience compromise

Correct, with nuances.
FBE was indeed introduced to support:

  • Direct Boot

  • Alarms

  • Phone calls

  • Services running before device unlock

:check_mark: It is a trade-off between UX and security, not a “pure win” for security.

4. The user does not directly control the keys

Yes.
You do not “enter the key” yourself. Instead:

  • you enter a password

  • it participates in derivation

  • the Secure Element decides whether to release the CE keys

:check_mark: This is an accurate description of the trust model.


Where the author is mistaken or oversimplifies

:cross_mark: 1. “Your password does not participate in encryption”

This is incorrect.

On modern Android:

  • password → scrypt / Weaver

  • used to derive keys

  • without the password, the Secure Element will not release CE keys

The password is not just a “signal”; it cryptographically participates in the process.

:backhand_index_pointing_right: The claim “the key is stored and can simply be extracted” is a forum-level oversimplification.

:cross_mark: 2. “If the chip is hacked, the data is immediately accessible”

Not quite.

Even if compromised:

  • rate-limit bypass is needed

  • hardware delays must be bypassed

  • memory access is required

  • proper boot context is required

This is why:

  • BFU is often not compromised

  • AFU is not always compromised

The author presents the Secure Enclave as a “cardboard lock.” This is false.
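Some rough arithmetic shows why the rate limiting alone matters, even for a weak PIN. The delay schedule here is illustrative, not Titan M's actual policy (real schedules escalate with failed attempts):

```python
# Illustrative only: assume the secure element enforces a flat 60-second
# hardware delay per attempt (real policies escalate much further).
pin_space = 10 ** 4            # all 4-digit PINs
delay_s = 60                   # assumed per-attempt hardware delay
worst_case_s = pin_space * delay_s
worst_case_days = worst_case_s / 86400
print(round(worst_case_days, 2))   # ~6.94 days to exhaust the PIN space
```

And that is the best case for the attacker: it assumes the throttling can't be reset, the chip hasn't wiped itself after N failures, and the user didn't pick a longer passphrase, which pushes the search time past any practical horizon.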

:cross_mark: 3. “Double encryption = absolute protection”

This is a common but naive assumption.

Why:

  • if the SoC is compromised → password input can be logged

  • RAM can be attacked

  • TEE can be attacked before key erasure

  • attacks can occur before screen-off

  • side-channel attacks are possible

:backhand_index_pointing_right: Two layers ≠ magic. They only reduce risk; they don't guarantee invulnerability.

:cross_mark: 4. “FDE was safer than FBE”

This is partly false, partly nostalgia.

True:

  • attack surface was smaller

  • nothing worked before password input

But:

  • old FDE had weak key management

  • worse multi-user protection

  • worse isolation

  • worse rollback protection

FBE is cryptographically stronger but architecturally more complex, and complexity = new attack vectors.
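One reason FBE is cryptographically stronger: every file gets its own key derived from a class key, so wiping one class key instantly kills a whole class of files. A simplified sketch (Android's fscrypt actually uses HKDF-SHA512; HMAC here is a stand-in for the same idea):

```python
import hashlib
import hmac
import os

def per_file_key(class_key: bytes, file_nonce: bytes) -> bytes:
    """FBE-style derivation: each file's key comes from the CE class key
    plus a random per-file nonce stored in the file's metadata."""
    return hmac.new(class_key, b"per-file|" + file_nonce, hashlib.sha256).digest()

ce_class_key = os.urandom(64)
key_a = per_file_key(ce_class_key, os.urandom(16))
key_b = per_file_key(ce_class_key, os.urandom(16))
# Distinct files get distinct keys; destroying ce_class_key destroys them all.
```

Old-style FDE, by contrast, hinged on a single disk key, which is simpler but gives exactly the weak key management and poor multi-user isolation listed above.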


GrapheneOS, yes, is the best defense against Cellebrite, simply because it protects against key extraction by closing off the USB stack. But it does not change the encryption architecture.

Modern FBE systems (if implemented correctly) can:

  • keep keys inside a hardware-backed enclave,

  • strictly limit key lifetimes,

  • and be just as painful for forensic extraction.
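"Strictly limit key lifetimes" can be made concrete with a small sketch. This is not any real keystore API, just an illustration of the pattern: the decrypted CE key lives in memory only while the device is unlocked, and is overwritten (not merely dereferenced) on lock.

```python
import os

class KeyManager:
    """Sketch of limited key lifetime: hold the CE key only while
    unlocked, and zero it out the moment the device locks."""

    def __init__(self):
        self._ce_key: bytearray | None = None

    def unlock(self, released_key: bytes):
        # Copy into a mutable buffer so it can be wiped later.
        self._ce_key = bytearray(released_key)

    def lock(self):
        if self._ce_key is not None:
            # Overwrite before dropping the reference, so the key
            # material does not linger in memory for a forensic dump.
            for i in range(len(self._ce_key)):
                self._ce_key[i] = 0
            self._ce_key = None

    def key_available(self) -> bool:
        return self._ce_key is not None
```

The shorter the window in which `key_available()` is true, the less an AFU extraction tool has to work with.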

It's too difficult for me, so I asked GPT again :)

This is precisely why Google's Pixels can't be trusted: unlike with any other device, on Pixels Google has total control over the firmware, with plenty of 'features' that become vulnerabilities once discovered. A recent example: in 2024 it was discovered that Pixels didn't enter the BFU state when booted into fastboot.

Kindly stop using AI to draft your responses.


Use GrapheneOS and that's it.


On Pixels?!?! Did you read my post above? LOL.

You mean a ’ghost phone’ with a ‘new’ OS that’s magically compatible with Android apps?

Google is Android's developer. GrapheneOS is Android-based. The Pixel is an open-source smartphone, and GrapheneOS is open source too.
GrapheneOS = Android + security.

GrapheneOS was founded in 2014.

Fastboot? Fine. But what about Samsung's 72-hour grace period that auto-removes the old password after you change it? What about the almost complete lack of documentation? Cellebrite can break into a Samsung in BFU; a Pixel running GrapheneOS, it cannot.

No, it is NOT! Only the Android part is open source. The firmware is NOT. The processors' firmware is not. You can't build the firmware; you can only include the binaries.

Android does not have access to the processors' firmware, which is a mini OS in and of itself; Android has no control over it. This is why that glaring Pixel vulnerability (access to unencrypted data in fastboot) could only be fixed by Google: the back door was in closed-source firmware.

Edit: You are simply misinformed. Google does have some old repositories for its Titan chips, but you can't build from them for Pixels. It's like Chromium: Google uses some sources from open-source Chromium, but the finished product, Chrome, is a black box. The same goes for Pixel firmware: a black box.

Knowing Google's appetite for grabbing data, the Pixel's firmware is Gapps on steroids, because Gapps can be disabled or left out. You can't do the same with the processors' firmware and other binaries: without them, the phone would be a brick.

GrapheneOS can’t modify the chip’s firmware, but it can modify the Linux kernel (Android is based on Linux).

GrapheneOS patches the kernel so that during a reboot or shutdown (including switching to Fastboot/Bootloader), the RAM is filled with “garbage.” This erases encryption keys from RAM before a potential attacker can extract them via a cold boot or bootloader bug.

Titan M State Management: In the Pixel, the security chip (Titan M/M2) is responsible for storing keys. GrapheneOS modifies the logic for interacting with this chip so that when a reboot into a special mode is requested, access to the protected key slots is immediately blocked at the hardware level.

Process Termination: If the system detects an abnormal shutdown, protection mechanisms attempt to prevent the “After First Unlock” state from being saved.

Driver Patches (Kernel-level Patches)
A driver is a “translator” between Android and the processor firmware. If there’s a bug in how the processor processes data, GrapheneOS can patch the driver in the system kernel. How it works: Instead of patching the hardware itself, it changes the way commands are sent to it, bypassing the vulnerable areas.

I’ve edited my previous post: Read ‘edit’ about Titan. It is black box containing whatever Google wants. It could contain all kinds of back doors, as evidenced by the unencrypted data vulnerability I’ve described.

In my view, the only reason Google decided to design firmware for processors (something it has little to no experience with) is the ability to hide spyware and malware in that firmware. Designing chips requires not only multi-billion-dollar R&D budgets but also 20, 30, 40, even 50 years of experience, like Intel, AMD, Qualcomm, and even Samsung have, and Google totally lacks that.

So, no matter how good (or bad) your favorite custom Android distribution is, it is futile on Pixels. On a Pixel, it will be a house with a good front door and no back wall.

By the way, let's refrain from discussing GrapheneOS. This is not about them, but about encryption on smartphones. GOS doesn't do FDE. So take your 'arias' about GOS to their forums; everybody there will pat you on the back. If you continue, I will express my opinion about GOS too, in great detail. Hint: it is very low. But that discussion doesn't belong here.

Qualcomm has a lot of vulnerabilities, EDL for example. But since 2022 there have been no documented Cellebrite extractions from Pixels running GrapheneOS, unlike, say, Samsung devices. Yes, there is no fully safe OS, but this is the best on the available list. What do you recommend, if not GrapheneOS?