Scaling is worse quality than native rendering. The scaling algorithm isn't the best available and could be (much) improved, and the scaler often applies a sharpening filter that creates nasty artefacts.
Obviously scaling is worse quality; that doesn't address the question I asked. I see no reason why the hardware would be sharpening the image by default. My guess is that the function is intended to make text more legible and isn't meant to be used on the plane storing the colour buffer. That's likely a snafu in the toolchain.
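To see why sharpening after upscaling produces those artefacts, here's a toy 1-D sketch (not the console's actual scaler, just an illustration): linearly upscale a clean step edge, then apply a simple unsharp mask. The sharpened signal overshoots the original value range around the edge, which is exactly the halo/ringing effect being complained about.

```python
# Illustrative only: a linear upscale followed by an unsharp mask,
# showing overshoot (halo artefacts) around a hard edge.

def upscale_linear(samples, factor):
    """Upscale a 1-D signal by linearly interpolating between neighbours."""
    out = []
    for i in range(len(samples) - 1):
        a, b = samples[i], samples[i + 1]
        for step in range(factor):
            t = step / factor
            out.append(a + (b - a) * t)
    out.append(samples[-1])
    return out

def unsharp_mask(samples, amount=1.0):
    """Sharpen by adding back the difference from a 3-tap box blur."""
    out = list(samples)
    for i in range(1, len(samples) - 1):
        blurred = (samples[i - 1] + samples[i] + samples[i + 1]) / 3
        out[i] = samples[i] + amount * (samples[i] - blurred)
    return out

edge = [0.0, 0.0, 1.0, 1.0]            # a clean step edge, values in 0..1
scaled = upscale_linear(edge, 4)
sharpened = unsharp_mask(scaled, amount=1.5)

# The sharpened edge now overshoots the 0..1 range on both sides:
print(min(sharpened), max(sharpened))
```

On a smooth gradient the mask barely changes anything, but on hard edges (text, UI, high-contrast geometry) it pushes values past the original black and white levels, which shows up as dark/bright fringes. That's consistent with the guess that the filter was tuned for text legibility rather than for the whole colour buffer.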
One could design a tiny SoC with 50 different, exotic functional units, all with bizarre abilities. It'd be loaded with diverse functionality, cheap to manufacture because it's tiny, but very hard to use. Ergo, exotic != expensive. In this case the 'exotic' aspect, which isn't terribly exotic, is an ESRAM scratchpad, included because it was deemed cheaper than 8 GB of GDDR5.
In manufacturing, unique components are always more expensive.
We don't know how hardwired or programmable the solution is, so scaling quality may never improve over the console's lifetime, only in newer hardware revisions with different scaling silicon.