Future Proofing is no longer possible... here's why
FULL TRANSCRIPT
All right, today we're going to talk about a topic that is synonymous with building PCs, and that is the idea of future proofing. Future proofing is kind of impossible these days. We're going to talk about why the landscape has changed so much over the last 15 or so years when it comes to PC hardware and feature sets. We're going to explain why the idea of future proofing is impossible, but also what you can consider when upgrading your system, building a new system, or buying a pre-built to be as future ready as possible. Because it's becoming more and more impossible these days.
Now, we're not talking about pricing being the reason it's impossible. It's because of the changing dynamic of what PC components are, the way they interact with each other, and the feature sets making their way in that are getting gated behind hardware. The old problem was: is your hardware going to be fast enough for future games? Right?
Resolution scaling was the major future proofing thing people had to arm against. For instance, the Pascal-era 1080 Ti was an absolute beast of a graphics card. Yeah, I forgot we have one behind us right there; that's why it's back there. The 1080 Ti was capable of playing 4K games in its era. It wasn't the highest FPS at 4K, but it was capable of it, and when it came to future titles, the card was still able to run them. What you had to consider with future proofing was: can your system last a good five, seven, or ten years? Ten years seems to be the real number a lot of people wait for before making a major platform change or upgrade. And all you really had to worry about was whether game resolution and the difficulty of running a game were going to outpace the raw horsepower of your components, whether CPU, GPU, or system RAM.
But the new problem is that we have feature sets coming out that are gated behind the newest hardware. So what do I mean by that? Specifically, DLSS. If you remember, the 1080 Ti was the end of the GTX era of graphics cards from Nvidia. It moved over to RTX because of ray tracing; that's what the RT stands for. And obviously the RTX graphics cards were able to do real-time ray tracing significantly faster than CUDA cores could, because the RT cores were built from the ground up to be accelerators that do one job: ray tracing. Now, there was a lot of uproar: why is this behind a hardware wall? So Nvidia said, "Fine, we'll unlock it. Go ahead and run it on your Pascal card and watch what happens." And it was a terrible experience. It's one of those things where somebody who bought a 1080 Ti when it was new thought they were truly future proofing their system for maybe seven years, but the next generation made the obsolescence, and the writing on the wall of where hardware was going, really obvious.

The other direction the future has gone for graphics cards is DLSS, or deep learning super sampling. That is another hardware feature set that can't be turned on on Pascal or older cards, because it requires the tensor cores built into newer GPUs. That was a perfect example of a major shift in hardware design that was impossible to future proof for.
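To put rough numbers on what "super sampling" buys, here's a minimal sketch; the preset scale factors are illustrative of typical upscaler quality modes, not exact DLSS or FSR values:

```python
# Minimal sketch of upscaler math: the GPU renders at a lower internal
# resolution and the upscaler reconstructs the final output. Scale factors
# here are illustrative of common presets, not exact vendor values.

MODES = {"quality": 0.667, "balanced": 0.58, "performance": 0.5}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    scale = MODES[mode]
    return round(out_w * scale), round(out_h * scale)

w, h = internal_resolution(3840, 2160, "performance")
print(w, h)                               # 1920 1080
print(round((w * h) / (3840 * 2160), 2))  # 0.25: only a quarter of the pixels to shade
```

Shading roughly a quarter of the pixels at 4K is why the reconstruction step is valuable enough to gate behind dedicated hardware.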
Now here's the other thing. If you bought a 20 series card or a 30 series card, you were like, "Okay, I'm probably future proofed now." But how long will it be before they change the tensor core architectures or the RT core architectures? The CUDA cores are obviously still around, but you're starting to see feature sets become locked behind hardware again. With DLSS 3.0, DLSS 4.0, and now DLSS 4.5, you'll find that older RTX cards, no more than five or six years old, are incapable of running the latest feature sets, because once again they are locked behind hardware.
When it comes to upscalers like DLSS and FSR, though, it's still not cut and dry. You can't just go, "Well, my card supports it, DLSS 3 is out, so I'm getting all the DLSS 3 features." What I mean is, if you have an RTX card, you have access to the upscaler itself as it comes out: DLSS 1, 2, 3, 4, and 4.5. The upscaler is part of the hardware, the tensor cores and all that. But there are other feature sets that launch with a major version change and can get locked behind hardware, like frame generation. 30 series cards were the first cards truly locked out of a DLSS feature set. And I know you guys don't care about frame gen, but as an example, just hear me out.
30 series cards don't get frame gen. 40 series cards had frame gen 2x, which is just frame gen on, right? That means every other frame is an AI-rendered frame. It's a fake frame; we know how that works. The 50 series was the first card we saw lock the 40 series out of a DLSS feature set: multi frame gen. So now we're talking about 2x, 3x, 4x, and I think there's 5x and 6x now. But you see what I mean? There can be future technologies that live under an overall naming scheme like DLSS, but if you don't have the hardware that supports them, you're locked out of those feature sets.
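Here's a minimal sketch of what those multipliers actually mean, under the simplifying assumption that an Nx mode displays N-1 generated frames per rendered frame and that input latency still tracks the real render rate:

```python
def framegen_stats(rendered_fps: float, multiplier: int) -> dict:
    """For an Nx mode, each rendered frame yields N-1 generated frames."""
    displayed_fps = rendered_fps * multiplier
    # Input latency is still governed by the real render rate (generation
    # overhead ignored here for simplicity).
    render_frame_time_ms = 1000.0 / rendered_fps
    return {"displayed_fps": displayed_fps,
            "render_frame_time_ms": round(render_frame_time_ms, 1)}

for mult in (2, 3, 4):
    print(f"{mult}x:", framegen_stats(60, mult))
# 4x turns 60 rendered FPS into 240 displayed FPS, but the game still
# "feels" like 60 FPS, because only 60 real frames sample your input.
```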
AMD is kind of under fire right now because FSR4 is the first FSR feature set that's actually based on machine learning, where you have to have a 9000 series card, so a 9060 XT to 9070 XT, to actually utilize it and turn it on. And this is from the brand that has been notoriously backwards compatible, doing things via software rather than hardware to make them as widely available as possible. Even people who don't have AMD graphics cards could still run FSR, because it was software-based. But now you can't even run FSR4 on an Nvidia card if you wanted to, because it's locked to hardware on AMD's side with the 9000 series GPUs. With that said, modders have actually been getting FSR4 to work on older AMD cards, 7000 series and whatnot, showing that it's really less of a hardware lockout and more of a policy, a forced obsolescence, to try and get AMD users to buy more. But it's an example of hardware gating of new technology, which is something that is nearly impossible to plan for today.
So you're starting to see that being future proofed is more and more difficult, because it's not just a horsepower bet anymore. It's a horsepower and feature bet: whether or not the features are going to be available to you in the future. Now, you have to ask yourself: are these features important to you? Are they absolute must-haves? If they are, you're stuck in a conundrum. Do you buy the highest-end graphics card you can personally afford right now and hope it keeps up with future titles that rely on feature sets like DLSS, which games heavily rely on today?
It's unfortunate that developers have leaned on the efficiency of DLSS to shortcut some of the optimization of their own titles, relying on something like DLSS or even frame gen at this point, which isn't something you should really factor into future proofing. I think most PC enthusiasts and gamers these days know frame gen is not a solution to a horsepower or optimization problem. It's a band-aid. And I think we can all agree it's not something we should be that excited about, especially when they keep upping the amount of frame gen. We had frame gen, which alternated every other frame with an essentially AI-rendered fake frame, then 2x, then 3x and 4x, and now 5x and I think even 6x or something like that. Nobody's running that and going, "Look at how good my frames are." Everyone knows it's a terrible band-aid. Anyway, I digress.
It's an example of a future feature set that, again, is locked behind hardware: if you had a 30 series or 40 series graphics card, you were locked to 2x or less, right? Or off. And then obviously the 50 series allowed it to go all the way up to these crazy multipliers that exist now. But DLSS and feature sets like it have become band-aids, to where game developers are not optimizing as well as they should. That also means if you're on older RTX hardware, say a 3090 Ti, an extremely expensive graphics card for its time, although cheap by today's standards, it's not going to be capable of running the future versions of DLSS that come out. It's already capped, I think, at DLSS 3.0 or older; it can't run 4.0 or 4.5.
And so what's happening now is game developers are going to start relying on this new technology coming out, FSR from AMD and DLSS from Nvidia, because AMD and Nvidia are sending their engineers to these developers to optimize for those feature sets and get them implemented into their games. So what will start happening is you're going to feel like you're on a 1080 Ti again, because games are going to rely on hardware and features you're locked out of, because they don't physically exist on your card. At least, that's the story we're being told. I'm still not entirely convinced these feature sets are truly locked behind hardware, but that's the story being told.
Now, when we talk about CPUs, it's the same sort of thing. When it comes to future proofing, people who adopted, say, the Zen 3 architecture, like a 5950X, a beast of a CPU on AM4, immediately found themselves at a dead end of a platform, because if you adopted it then, you adopted it at the end of its life cycle. Remember, Ryzen and AM4 came out back in 2016. So you might have gotten yourself a beast of a system, but you also jumped on the train at the last stop. You aren't necessarily future proofed at that point, because you don't have an upgrade path. You just hopefully have a high enough CPU that it's going to stay relevant for what games demand in the future.
Now, when we look at titles like GTA 6 and some of these other future titles that are coming out, it's getting a little scary how high the horsepower requirements for your system are these days. I think people still running 3000 series Ryzen, maybe 4000 series Ryzen, are going to start feeling the pinch a little bit as CPU bottlenecks start to creep up in titles that are heavily CPU dependent. As we deal with more open-world and sandbox type titles, it becomes really apparent that your CPU can quickly become the bottleneck. But I bring up AM4 because of what happened right after that: a socket change to AM5. A change to DDR5 being required. And now we're already hearing rumors about AM6.
We're hearing rumors about DDR6. We're hearing rumors about PCIe 6.0, although I'd be surprised if PCIe 6.0 for graphics and full x16 slots happened so quickly on the heels of PCIe 5.0. But what's also happening with these major socket changes is that if you have to change the entire platform to adopt the newer technology, then you're definitely not future proofed at that point, right? The idea of future proofing, first of all, is a lie. There's no such thing as future proof. Your system will become obsolete at some point. So let's just go ahead and get rid of the word "proof" to begin with. I like to call it being as future ready as possible.
best as possible. Now, Intel although
they are struggling in many ways kind of
did it right where they had that interim
period where uh you know the 12th gen,
13th gen and 14th gen were able to
utilize either DDR4 or DDR5 which was
which was kind of nice. Uh giving you a
little bit of a buffer buffer period
where if you were coming on an older
platform and you had DDR4 RAM that you
wanted to use, you could throw it into
the newer platform getting the
appropriate motherboard and have more
longevity and lifespan to your system.
AMD didn't do that. that it had a hard
cut off and required DDR5 uh from the
start. And the same thing is going to
happen when DDR6 comes out. Now, when
that's going to be, given all the RAM
stuff going on right now, it probably
got pushed out a little bit, but it's
the kind of thing you have to consider
when we talk about uh our future
proofing or future readying our system.
Now, here's another trend we've seen over the last six years or so: the incredible increase in power requirements. The only parts of your system that I think you can truly future proof, and I'm just going to say the word "proof" as much as I hate it, because it's simple and we know what we're talking about, are your power supply and your case. That's about it. And the reason I say that is that even monitors, for a while there, had hardware requirements to be able to use features like FreeSync and G-Sync. There came a point where even your monitor, which you might have bought as a super high-end 4K panel right before these technologies came out, locked you out of those techs too, because it wasn't FreeSync or G-Sync ready, certified, compatible, whatever you want to call it. But a power supply just does one thing: it delivers power. It delivers 12 volts, 5 volts, and 3.3 volts. That's all it does.
Whether or not you have the right cables for, say, ATX 3.1 can be remedied with an adapter, and plenty of cable companies are making native cables that plug directly into older power supplies to convert them to the new connector. So you can be hardware ready like that. But the power demand of these parts keeps increasing. I mean, we've got 250 to 300 watt CPUs. We've got 600 watt to now 1,000 watt 5090s. That means an 850 watt power supply from just five years ago, which might have seemed like more than enough to survive several iterations and upgrades of your system, is now not enough for even something like a 5080 to be comfortable.
So you've got to start thinking about that now. You've got to build in even more headroom when buying a power supply, because who knows, you might find yourself having to buy a new one next gen, given these incredible jumps in power. For instance, the 2080 Ti, a 250 W card, kept the same TDP as the 250 W 1080 Ti. That jump from one generation to the next meant your power supply was safe. Then we saw the 350 watt 3090. Then the 450 watt 4090, and don't forget it was overclockable up to 600 watts because of the cable spec. Now the 5090, if we're talking enthusiast grade hardware, is requiring 600 watts of PCI Express delivery, which means a 1,200 watt power supply is really the real requirement here for safety.
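As a back-of-the-napkin way to think about that headroom, here's a rough, hypothetical sizing heuristic, not any vendor's official calculator; the transient factor, the "other components" figure, and the assumed GPU draws are all illustrative assumptions:

```python
# Hypothetical PSU sizing sketch: pad the GPU draw for transient spikes,
# add the rest of the system, and round up to a common PSU size.

COMMON_SIZES = [650, 750, 850, 1000, 1200, 1500]

def recommend_psu(gpu_watts: float, cpu_watts: float, other_watts: float = 100,
                  transient_factor: float = 1.4) -> int:
    # Modern GPUs can spike well above rated TDP for milliseconds, which is
    # what trips overcurrent protection on undersized units; pad for it.
    target = gpu_watts * transient_factor + cpu_watts + other_watts
    for size in COMMON_SIZES:
        if size >= target:
            return size
    return COMMON_SIZES[-1]

print(recommend_psu(gpu_watts=600, cpu_watts=250))  # 1200 -- in line with the 5090 guidance above
print(recommend_psu(gpu_watts=360, cpu_watts=250))  # 1000 -- why an 850 W unit feels tight for a 5080
```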
If we saw a 250, then a 350, then a 450, and then a 600 watt jump across just those few generations, and now we're seeing custom card builders making cards with dual plugs, which we all know should have always been the standard for anything running 450 watts or more, what happens if that becomes the new norm for the next gen of graphics cards? Are they now going to be 800 W out of the box? Because if you have two of those cables, the graphics card companies are going to say, "Hey, now we can push the power limit even more. It's a little bit safer." Who knows what the Nvidia spec is going to be, but Nvidia has clearly endorsed these crazy jumps in hardware power requirements. Your power supply is going to start feeling under-specced very, very quickly. That doesn't mean you should run out and immediately buy a 1,500 watt power supply if you're not an enthusiast buying that level of hardware. But even if you look at the 80 tier or the 70 tier of graphics cards, we've seen big jumps in power requirements there too, because a 5070 is now pulling the exact same power draw as a 2080 Ti was.
So that is the writing on the wall for what power demands in the future are going to be. Rules of thumb like Moore's law are being bent and broken these days. Right now we're in this era of "give us all the performance possible, damn the power draw"; by "give us" I mean the engineers. Just let it be what it is, and let the system integrators, motherboard manufacturers, and graphics card manufacturers build their power delivery systems and VRMs to handle the requirements. So we're seeing this crazy scaling of power draw, where a high-end gaming system just seven years ago would have pulled maybe 600 watts from the wall, and now we're pulling more than that just to feed the graphics card these days.
So you can see where it really becomes difficult to hold onto the idea of future proofing. The reason I'm telling you this isn't to doom and gloom you and say, "Well, future proofing is dead." I've made this video numerous times over my YouTube career, and the argument has always been the same: future proofing is not possible. It's just a question of how long your PC will stay relevant, stay within the requirements of the titles you want to play, and be ready for future titles. That's all you're hoping to do when you build a quote-unquote future proof system: know that you can play titles well into the future without being forced to upgrade.
Remember, being forced to upgrade your system is truly what makes you not future ready. And not all of these future technologies are something you have to run. But DLSS is one where, if you haven't experienced it because you're running really old hardware, you're really missing out. It's a tech a lot of people pushed back against in the early days because it was kind of a hot mess: it was noisy, it had all sorts of artifacts, and it was just not a great experience. Today, everyone on both sides of the fence, AMD fans and Nvidia fans, can agree, whether we're talking about FSR or DLSS, that it is an amazing technology, because you can't truly tell the difference between the upscaled image and the base resolution. So that's one thing I think you should hop on if you've pushed back and haven't upgraded yet. That's a feature worth upgrading for, and it will make you more future ready for the titles that are coming, because guess what? They're all going to be relying on DLSS to make the game playable.
I think they're just great features that have made developers a little lazier when it comes to truly optimizing for the hardware. But if you want to truly be future proofed: one, I hope you have deep pockets, because you're going to be upgrading every two to three years now with how feature sets keep popping up and how often you get gated behind hardware walls before you can use them. And two, you're definitely going to have to over-spec your power supply. As much as we would all love to hear otherwise, let's just use Nvidia as an example, because they are dominating the gaming GPU space (with the AI stuff on the side), having the most cards available and the highest performance you can buy if you have very, very deep pockets. I don't expect Nvidia with the 6000 series to go, "We've kept the same performance at 40% better power efficiency." I'm just not expecting that to be the case. I'm just not.
That used to be the argument, though. Remember, performance per watt used to be something people were super excited about. Now, let's be fair: us gamers really push back when a brand goes, "Look at the performance per watt." As much as we like to get up in arms, we're arrogant, and we like to say, "But show me the performance." And if the performance is the same at fewer watts, gamers go, "Oh, that's not an upgrade." So let's be honest, let's be real with ourselves: we as gamers are very difficult to please, especially PC enthusiasts, and we've got to take responsibility for that. If we've been loud enough to say, "No one cares about performance per watt, we just want the performance," and then they give us the performance at incredibly inefficient power draw, we can't then turn around and yell, "BUT LOOK AT THE POWER DRAW." You can't have it both ways, guys. It's hard to be introspective about our own biases, but I think most PC enthusiasts would probably say, "I want the performance over the efficiency." And that's the reality there.
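Just to put numbers on that argument, here's a toy performance-per-watt comparison; the FPS and wattage figures below are made up for illustration, not benchmarks of any real card:

```python
# Toy perf-per-watt illustration with hypothetical numbers.

def perf_per_watt(avg_fps: float, board_power_watts: float) -> float:
    return avg_fps / board_power_watts

old = perf_per_watt(avg_fps=100, board_power_watts=450)  # hypothetical last-gen card
new = perf_per_watt(avg_fps=130, board_power_watts=575)  # hypothetical new card

print(f"old: {old:.3f} FPS/W, new: {new:.3f} FPS/W")
print(f"raw uplift: {130/100 - 1:.0%}, efficiency uplift: {new/old - 1:.0%}")
# A "30% faster" headline can hide a nearly flat ~2% efficiency gain.
```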
But to truly be future ready, especially with so many titles right now that have a lot of unknowns around them, you're probably looking at a real two to three year upgrade cycle at this point. And that is expensive and very difficult to keep up with. Your case? It's just a box that holds your stuff and that air can flow through, right? But let's talk about things like BTF. If BTF becomes a future standard, and I don't think it will, but if it did and you don't have a BTF case... that's "back to front" or "back to the future," whatever you want to call it, because they keep changing the name. Or Project Zero from MSI, which puts the plugs on the back of the motherboard so they don't show at all. If you don't have a case that supports that, and it somehow becomes the standard, which I highly doubt, then you're buying a new case.
Now, when it comes to future proofing, I think I've given you some examples of why calling anything future proof is absolutely wrong and impossible. There are plenty of things you can do to stay as future ready as possible; I already gave you my recommendation that the case and power supply are about the only two things where you can buy ahead without being left out. But here's the hard truth. Future proofing used to mean buying the highest-end hardware you could today, then lowering settings in the future to make it last as long as possible as it aged out. The reality today is that you buy high-end hardware and hope you don't get locked out behind, you know, paywalls for feature sets. It's kind of like hardware DLC. We've all been screwed by game developers; now we're getting screwed by the hardware manufacturers.
So sound off down below if you've built yourself a system you thought was future proofed and got slapped upside the face by reality because you got locked out of feature sets. Thanks for watching, guys. As always, we'll see you in the next one.