Windows 11 [2021]

My issue on Windows 10
https://forum.beyond3d.com/posts/2212290/

is kinda fixed on Windows 11
- Fingerprint sensor still gone
- Windows Store "install" button still doesn't work
+ If I scroll down in the Windows Store, a top bar with an "install" button slides down. This button works!
 
  • 64-bit CPU with virtualisation, i.e. Intel VT-x or AMD-V;
  • Native Mode UEFI (with the legacy Compatibility Support Module (CSM) disabled);
  • UEFI 2.5/2.6 features like the Memory Attributes Table (MAT), Windows SMM Security Mitigations Table (WSMT), and Secure Memory Overwrite Request (MOR) v2;
  • Second Level Address Translation (SLAT), i.e. Intel VT-x Extended Page Tables (EPT) or AMD-V Rapid Virtualization Indexing (RVI);
  • IOMMU virtualization in Intel VT-d, AMD-Vi, or ARM SMMU;
  • Trusted Platform Module (TPM) 2.0, hardware or firmware-based;
  • HVCI-enabled drivers.
With all this virtualisation will (some?) apps be running in sandboxed VMs?
Can I get (legacy) games to think they're running fullscreen at whatever resolution, while at the desktop level they're actually a resizable/snappable window?
 
The more I see of it the less I like it.
It has a lot of nice things underneath, but almost all the changes to the UI are steps back in terms of UX.
I foresee a "revolution" in about 3-4 years from now, where they revolutionize the UI with things such as a ribbon and a number indicating how many notifications you have unread.
 
With all this virtualisation will (some?) apps be running in sandboxed VMs?
Can I get (legacy) games to think they're running fullscreen at whatever resolution, while at the desktop level they're actually a resizable/snappable window?
I wager they are, and I also wager you can still run full-screen 3D apps without needing to do the "full screen borderless window" hack. Despite me bringing up Palladium (Microsoft's project name from two decades ago), really the ultimate goal is a Trusted Compute Platform as described by the EFF here: Trusted Computing: Promise and Risk | Electronic Frontier Foundation (eff.org)

Here's a summary of the article:
In Microsoft's account of the trusted computing architecture, the anticipated changes are divided at a high level into four groups, all of which require new hardware to be added to today's PCs. These are
  1. Memory curtaining
  2. Secure input and output
  3. Sealed storage
  4. Remote attestation
Each feature has a different security rationale, although the features can be used in conjunction with one another.

For the benefit of my B3D'ers who want the TL;DR version, here you go...
  • Memory Curtaining
    Hardware-enforced memory isolation, where the core OS actually has no access at all to memory pages within a "curtained" application memory area, and vice-versa. Essentially, a hardware sandbox for memory pages, providing an even deeper level of protection when combined with the old standby, address space layout randomization. This thwarts malware which would steal memory pages from within user space and, in the more serious case, means that even a compromised OS still cannot access the application memory space.
  • Secure Input and Output
    Basically, the core OS shouldn't be able to see your application's input and output streams, and vice-versa. There would be a direct, secured channel and a related, isolated driver stack from the human input device to the application, and then out of the application into the display driver stack and into the display device itself, along what is essentially an HDCP channel.
  • Sealed Storage
    I'll just copy-pasta this from the article itself, as they did a great job explaining it: "Sealed storage is an ingenious invention that generates keys based in part on the identity of the software requesting to use them and in part on the identity of the computer on which that software is running. The result is that the keys themselves need not be stored on the hard drive but can be generated whenever they are needed-- provided that authorized software tries to generate them on an authorized machine."
  • Remote Attestation
    In short, using cryptographic hash checks to ensure an application hasn't been altered, either locally or remotely. If the app changes hash values, then it isn't the same app anymore and any "permissions" it might have had (see also: the three items above) are invalid. This one has some issues, because an entirely valid application update / patch would change the hash. Does that make it bad? On the other end of the spectrum, a malicious application would still have a "valid" hash, so a user could still grant it permissions if they're ... well, a typical Windows user ;)
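For anyone who wants to see the hash property itself in action, here's a trivial, purely illustrative PowerShell snippet (a throwaway file, nothing to do with the actual attestation protocol): any change to the binary yields a completely different SHA-256 hash.

Set-Content -Path .\app.bin -Value 'original build'
(Get-FileHash -Path .\app.bin -Algorithm SHA256).Hash    # baseline hash

Set-Content -Path .\app.bin -Value 'patched build'
(Get-FileHash -Path .\app.bin -Algorithm SHA256).Hash    # different hash, so "not the same app" any more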
I'm not sure how far into the paint Microsoft is going with the TCG model, however given the hardware requirements, they're obviously doing Memory Curtaining. Also, Windows as far back as Windows 2000 has contained the basic technology to accomplish Sealed Storage, albeit not at a per-application level (see also: File Encryption - Win32 apps | Microsoft Docs, but swap user certs for "locally signed" application certs and you've got it.)

The secure input and output is a bit more challenging; IOMMU (native hardware passthrough) gives us the ability to pass hardware directly to a virtual sandbox, invisibly to the base OS or other apps. This might work for input devices, passing the HID stack into the foreground application. For output devices, pure hardware passthrough wouldn't work if the app wasn't in full screen. I'm not sure how they would solve this in today's display software stack.
 
Anyone got a way faster boot time after upgrading to Windows 11?

It's like the blank screen between the OEM logo and the Windows lock screen is nearly instantaneous.
 
Before someone asks: the Encrypting File System capability in NTFS (available in Windows 2000 and later, I corrected my post above) protects against a very different attack vector than BitLocker (available in Windows Vista and later.)

BitLocker ensures your disk is only accessible via A: your machine, which prevents sideloading the drive into another system, and B: an OS which has access to the private key. What this really does is protect the security of the core of your bootable operating system. However, once the machine is booted into the OS, BitLocker is completely done and out of the way, and the system is now vulnerable to filesystem attacks.

The Encrypting File System (EFS) capability doesn't encrypt the whole disk; instead it encrypts individual files and/or folders using the individual user's private key. Thus, an admin user or even NT AUTHORITY\SYSTEM cannot access the files, because they have no access to that private key. This is intended to protect user data rather than the integrity of the base operating system.
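To make the distinction concrete, here's a minimal sketch from an elevated PowerShell prompt (the folder path is just an example, and the BitLocker cmdlets are only present on editions that ship the feature): BitLocker is queried per volume, while EFS is applied per file or folder.

# BitLocker: whole-volume encryption, reported per volume
Get-BitLockerVolume -MountPoint "C:" | Select-Object VolumeStatus, ProtectionStatus, EncryptionMethod

# EFS: per-file/folder encryption bound to the user's certificate
cipher /E /S:C:\Secrets            # encrypt an existing folder and its contents (path is just an example)
(Get-Item C:\Secrets).Attributes   # should now include 'Encrypted'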
 
BitLocker ensures your disk is only accessible via A: your machine, which prevents sideloading the drive into another system

OMG. I thought as long as I have the key I could sideload the disk on another computer. Thanks for this info.

Gotta disable BitLocker on all of my devices tomorrow, because I have really bad luck with electronics, and the ability to simply plug disks into another computer has saved my ass multiple times in the past.
 
If you have the enormously long key in physical form somewhere, you can still sideload it either by physically attaching it to another PC or using bootable Windows media on the affected machine. No need to disable BitLocker if you have that key handy :)

By the way, the key is not the same as the PIN you might be using to boot your machine. Some implementations of BitLocker (usually at the enterprise scale) may force the user to type a PIN to unlock their drive. This is an optional component and isn't related to the encryption key itself.
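If you want to know where those recovery keys are before you ever need them, something like this in an elevated PowerShell should list the numerical recovery password protector(s) for C: (assuming the BitLocker module is available on your edition):

(Get-BitLockerVolume -MountPoint "C:").KeyProtector |
    Where-Object { $_.KeyProtectorType -eq 'RecoveryPassword' } |
    Select-Object KeyProtectorId, RecoveryPassword

# The classic command-line equivalent:
manage-bde -protectors -get C: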
 
Microsoft removes application to verify Windows 11 requirements - relaxes requirements (guru3d.com)
Microsoft has deleted the application that allowed us to verify that our PC met the minimum Windows 11 requirements, following multiple detection failures.

Apparently even Microsoft itself does not currently know what the requirements of Windows 11 are. The company has now acknowledged that it is still evaluating various requirements, and thus the application is not testing against the final requirements and the results of your analysis may therefore be incorrect. For instance, first the company said that TPM 2.0 would be necessary, but afterward it confirmed that only TPM 1.2 is required, and there are also concerns about the compatibility of both AMD and Intel CPU generations.
...
Updated: The minimum system requirements have now been changed by Microsoft. Microsoft stresses that eighth-generation Intel, AMD Zen 2, and Qualcomm 7 and 8 series SoCs will run Windows 11. The software company will also use Insider builds to make sure that first-generation Ryzen and 7th-gen Intel processors are still adequate.
 
Perhaps it's just something bugged in the older chips, or there is an exploit that is not known to the public yet and Intel/AMD want to avoid having those chips stay in use?
Perhaps that carries over to Kaby Lake but is fixed in the Kaby Lake refresh and Coffee Lake?
Or, considering that Intel changes sockets and chipsets so often, maybe the fault is in older chipsets?
No, it's just that HVCI code integrity works best with Intel MBEC (Mode-based execute control for EPT) or AMD GMET (Guest-mode execute trap for NPT), which are not available on earlier CPUs, and this has been known since 2017.

MBEC (Mode Based Execution Control) the culprit why only more modern CPUs can run Windows 11

https://www.bleepingcomputer.com/ne...andards-for-highly-secure-windows-10-devices/
https://docs.microsoft.com/en-us/wi...ualization-based-protection-of-code-integrity


Note how the latter uses the wording 'works better with MBEC', which sounds far more relaxed than 'requires 8th-gen Kaby Lake or Zen 2':

Note
Because it makes use of Mode Based Execution Control, HVCI works better with Intel Kaby Lake or AMD Zen 2 CPUs and newer. Processors without MBEC will rely on an emulation of this feature, called Restricted User Mode, which has a bigger impact on performance.

However there are still UEFI firmware requirements for Hypervisor-enforced Code Integrity as per above, which are unlikely to be implemented in systems released before 2017...
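Incidentally, the same Win32_DeviceGuard CIM class that reports hardware capabilities can also tell you whether HVCI is configured and running on your current install; a quick sketch from an elevated PowerShell (2 denotes HVCI in the SecurityServices arrays):

$dg = Get-CimInstance -Namespace ROOT\Microsoft\Windows\DeviceGuard -ClassName Win32_DeviceGuard
$dg.SecurityServicesConfigured    # services turned on (1 = Credential Guard, 2 = HVCI)
$dg.SecurityServicesRunning       # services actually running; a 2 here means HVCI is active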


found this interesting
Nice find - it looks like they've actually been planning everything since 2013!

PDF: Overview of Windows 10 Requirements for TPM, HVCI and SecureBoot - UEFI Spring Plugfest 2015

(Well, hopefully not the mindless ditching of pre-2019 AMD CPUs, but the actual OS implementation of Hypervisor-Enforced Code Integrity (HVCI) and Windows Defender Application Control (WDAC), formerly Device Guard).
 
With all this virtualisation will (some?) apps be running in sandboxed VMs?
HVCI code integrity is mostly for protecting trusted kernel-mode code (the OS and device drivers) from untrusted kernel-mode code (malware). Therefore terms like 'sandboxed', 'container', etc. don't really apply here.

Can I get (legacy) games to think they're running fullscreen at whatever resolution but at Desktop level it be a resizable/snappable window?
If by 'legacy' you mean 'MS-DOS', then no.

You could do it in OS/2 2.x-4.x, which allowed full-screen graphics games to run in a DOS box window using a Virtual DOS Machine, as well as in Windows XP and 32-bit Vista/7 with XP drivers, with some limitations. But unfortunately the 16-bit NTVDM is not included with 64-bit x64 editions of Windows, because AMD64 long mode doesn't include Virtual 8086 mode.

I wager they are, and I also wager you can still run full-screen 3D apps without needing to do the "full screen borderless window" hack.
Native Windows applications (i.e. DirectDraw/Direct3D/Direct2D) have to directly support full-screen and windowed modes; virtualization would make no sense here, because each application already draws into a dedicated rendering surface, and the final image is composited (stretched/filtered) by the Desktop Window Manager using GPU draw calls.
 
Apparently even Microsoft itself does not currently know what the requirements of Windows 11 are. The company itself has now recognized that it evaluates various requirements and thus the application is not testing against the final requirements
Oh, boy... this is really shaping up to become the most botched announcement in the history of operating systems.

I sincerely hope Windows 11 will avoid the fate of becoming yet another Windows 8.x or Vista-like debacle.


This could have been the real-life dialogue between Windows 11 and macOS 11:

"- This time, I cannot fail! [*THAWK*]
[grumbles] Rakes... my old arch-enemy.
- I thought I was your arch-enemy..."

youtube.com/watch?v=aRq1Ksh-32g
youtube.com/watch?v=mUbiOwLvSt0
 
I have tried Windows 11 already. I really like the new font and the new interface. I am a very visual person, and this is probably the most beautiful Windows to date.

I also like the absence of the Control Panel once and for all (perhaps it's just hidden). Other than that its behaviour doesn't change much compared to Windows 10, though I just left everything as it is; I am thinking of resetting it as if it were a fresh install.

Not sure where Cyan lives, however North America is huge and yet has a relatively low density of internet connectivity outside of the major metro areas / regions.

I spent the last 10 months driving my family all over the continent (22 states and 15,000 miles!) and we had to be very intentional about where we stayed so I could continue to work my IT job even though I was remote. Coverage gets really spotty, really quickly, as you roll away from cities and major interstate highways. :(
I live in a mountainous area in Galicia, relatively close to northern Portugal. There are like 40 inhabitants in the village where I live. We live at about 900m above sea level.
 
I see no changes in the requirements though; they just want to make sure that certain modern chips are compatible with Windows 11. In the end they are asking for more or less the same, but are considering adding support for the first generation of Ryzen processors.

My previous computer had a Ryzen 1500X, and if I still had that PC nowadays, sure, I'd find that lack of support odd, 'cos it was a hell of a good processor to run Windows 11, 12 and whatever. It wasn't without its flaws, which I experienced myself, but that's an entirely different matter.
 
Native Windows applications (i.e. DirectDraw/Direct3D/Direct2D) have to directly support full-screen and windowed modes; virtualization would make no sense here, because each application already draws into a dedicated rendering surface, and the final image is composited (stretched/filtered) by the Desktop Window Manager using GPU draw calls.
This is where my knowledge breaks down a bit. In a platform where the TCG expects a "protected path" from application to physical display device, the model you describe presumes a trust between the app and the compositor service (DWM.) In a world where the other three TCG controls are in place, e.g. ensuring DWM hasn't been compromised because the OS is checking binary signatures and curtaining memory access, I suppose this trusted reliance on DWM is the best we can get.

I suspect Win11 fully deploys the Hyper-V feature for arbitrary sandboxing; their Type-1 hypervisor model is a little different from KVM or ESXi in that it builds partitions for the "host" operating system. The "parent" partition maintains the kernel, the literal hardware drivers and basic VM management plane things. The "child" partition is basically where user space applications live, along with another copy of the kernel to make it all work, and what end up looking like pointer drivers (virtualization service consumers) to the parent partition's resources. This all happens even if you never actually create any virtual machines (and if you do, they are separate partitions sitting atop Ring -1, conceptually adjacent to the child partition.)

The only hardware I have which is new enough to try out all this newfound craziness is my gaming laptop. I'd have to crack it open to swap NVMe drives, however now I'm really curious and might just actually do it.
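(In the meantime, you can at least see whether the hypervisor and virtualization-based security are already live on a given install; a quick sketch using the standard CIM classes, elevated PowerShell assumed:)

# Is Windows itself already running on top of the Hyper-V hypervisor?
(Get-CimInstance -ClassName Win32_ComputerSystem).HypervisorPresent

# Is virtualization-based security enabled and running?
$dg = Get-CimInstance -Namespace ROOT\Microsoft\Windows\DeviceGuard -ClassName Win32_DeviceGuard
$dg.VirtualizationBasedSecurityStatus   # 0 = not enabled, 1 = enabled but not running, 2 = running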
 
If you have the enormously long key in physical form somewhere, you can still sideload it either by physically attaching it to another PC or using bootable Windows media on the affected machine. No need to disable BitLocker if you have that key handy :)

By the way, the key is not the same as the PIN you might be using to boot your machine. Some implementations of BitLocker (usually at the enterprise scale) may force the user to type a PIN to unlock their drive. This is an optional component and isn't related to the encryption key itself.

Thank you! That's a relief. I got the keys on my MS account. Dunno how it did that. Maybe there's a key backup checkbox somewhere that I checked.
 
In a platform where the TCG expects a "protected path" from application to physical display device, the model you describe presumes a trust between the app and the compositor service (DWM.)
DWM is a user-mode Direct3D/DXGI application, not a kernel-mode service.

I suspect Win11 fully deploys the Hyper-V feature for arbitrary sandboxing
The "parent" partition maintains the kernel, the literal hardware drivers and basic VM management plane things. The "child" partition is basically where user space applications live
Hypervisor-based application 'sandboxing' was the idea behind Windows 10X - which was scrapped for performance reasons, according to multiple reports, so there have to be additional hardware abstractions in the CPU to make it work flawlessly.

Windows 11 requires hypervisor-enforced code integrity (HVCI) instead, and this only concerns kernel-mode components.
 
I performed a fresh install (using the Reset option). I think this is the most gorgeous Windows to date, visuals-wise. Things I didn't like... well, making a browser the default seems complicated now, 'cos you need to associate every single file format a browser can open to have a "default" browser. Other than that, good riddance to the Control Panel (though it's still there).

Nice explanation... and maybe he has a point when he says that they are probably going to back off on the TPM 2.0 and Secure Boot requirements. A shame somehow, but you also leave many people in the dust if you don't. Imho, the best solution would be to ask for those requirements on any new computer going forward and show a default message for existing computers not fulfilling them.

 
Is there anyone still using AMD Zen / Zen+ (Ryzen 1000/2000/1000AF-series) or Intel Skylake / Skylake X / Kaby Lake (Core 6000/7000/7000X-series) desktop CPUs?

I need you to open the 'PowerShell (Admin)' shortcut by right-clicking the Start menu, run the commands below, and see if the resulting list includes number 7 (for MBEC/GMET support):

$Win32_DeviceGuard = Get-CimInstance -Namespace ROOT\Microsoft\Windows\DeviceGuard -ClassName Win32_DeviceGuard
$Win32_DeviceGuard.AvailableSecurityProperties

Security properties map as follows: 0 = None, 1 = Hypervisor, 2 = Secure Boot, 3 = DMA protection, 4 = Secure Memory Overwrite Request, 5 = NX protection, 6 = SMM mitigations, 7 = Mode Based Execution Control, 8 = APIC virtualisation.
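If you'd rather not eyeball the list, a one-liner variant of the same check (same CIM class as above) should simply print True when property 7 is reported:

(Get-CimInstance -Namespace ROOT\Microsoft\Windows\DeviceGuard -ClassName Win32_DeviceGuard).AvailableSecurityProperties -contains 7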
 