Post your computer specs!~
- Thread starter Mysteek
chesebert
18 Years An Extra-Hardcore Head-Fi'er
- Joined
- May 17, 2004
- Posts
- 9,354
- Likes
- 4,736

OK... what? I'm blown away. How are you getting such high read and write speeds out of PCIe 4.0 drives? Even in RAID 0 you should top out at 15,000MB/s MAX, it shouldn't be any higher than that, and yet you're blowing that number out of the water. I can understand the benefits of huge amounts of RAM and good caching, but still... how are you getting these numbers?

Maybe look at the numbers I was mentioning again. I am getting HALF of 15,000MB/s on a Gen 4 drive.
Here is a screenshot of a test 5 minutes ago on my 4TB SN850X:

With that said, before Gen 4, my M.2 drives would be about 3500MB/s (basically half of this drive). I haven't noticed a difference in performance/time to load anything, or DO anything between either drive. Except in one area... if you have an hour-long video you filmed in 4k, and used three cameras to film it, then the faster drives DO cache/buffer the videos quicker in CyberLink PowerDirector. But that's the ONLY time I noticed any difference. However, even that is pretty minimal. It's more "felt" intuitively, and I haven't timed it out because I don't do reviews on PC hardware. It's enough for me to go "Oh, it's a difference", but I didn't go "OHMYGOSH THAT'S so awesome!!eleven!!" like I do when I get a CPU that is 4x faster lol
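If anyone wants a rough sanity check of sequential speed without CrystalDiskMark, a few lines of Python will do. The file path below is just a placeholder for any multi-GB file on the drive you're testing, and note that a repeat run will be inflated by the OS page cache (proper benchmarks bypass it with direct I/O), so treat this as a sketch, not a real benchmark:

```python
import time

# Placeholder path: point this at any multi-GB file on the drive under test.
TEST_FILE = "D:/some_large_file.bin"
CHUNK = 8 * 1024 * 1024  # 8 MiB reads, big enough to hit sequential speed

def sequential_read_mbps(path: str) -> float:
    """Stream the whole file in large chunks and return MB/s."""
    total = 0
    start = time.perf_counter()
    with open(path, "rb") as f:
        while data := f.read(CHUNK):
            total += len(data)
    return total / (time.perf_counter() - start) / 1_000_000

print(f"{sequential_read_mbps(TEST_FILE):,.0f} MB/s")
```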
SilverEars
Headphoneus Supremus
- Joined
- Sep 18, 2013
- Posts
- 15,825
- Likes
- 7,327
I'm mainly concerned with the transfer speed of small files, and with compression/decompression. Is that dependent mostly on the CPU? I would like to see a comparison of that across different CPUs.
This is a topic that gets less coverage from most PC "gurus" because it isn't as important to most people as gaming fps, video encoding, and raw processing numbers.
TL;DR breakdown: in most cases, the CPU will be the biggest contributor to file compression performance.
With that said, this is tricky. NVMe drives' biggest weakness is the speed of very small file transfers. For example, if there are numerous 4KB-300KB files to move, you may notice the copy speed goes WAYYY down compared to a single 100GB file. Older platter hard drives (HDDs) were even worse in this category, since seek times from one sector to another added delay depending on the distance across the platter, etc...
But anyway. For small file transfers, I think it's ultimately not a huge deal unless ALL you do is transfer small files. Take, for example, copying "God of War" on PC from one drive to another... good luck thinking it's going to be quick, because TONS of its files are really small. But when I have a single video file that is 1.5GB... OMG, 1GB per second! Woah! (A quick way to see this for yourself is sketched below.)
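Here's a rough Python sketch of that effect; it builds its own temp files (one ~1 GiB file versus the same gigabyte split into 10,000 small files) and times copying each set. Absolute numbers will vary wildly with page cache, filesystem, and drive, but the small-file run should come out far slower:

```python
import os
import shutil
import tempfile
import time

def timed_copy_mbps(paths, dst_dir):
    """Copy files to dst_dir and return aggregate throughput in MB/s."""
    total_bytes = 0
    start = time.perf_counter()
    for p in paths:
        shutil.copy(p, dst_dir)
        total_bytes += os.path.getsize(p)
    return total_bytes / (time.perf_counter() - start) / 1_000_000

src = tempfile.mkdtemp()
dst_big = tempfile.mkdtemp()
dst_small = tempfile.mkdtemp()

# One ~1 GiB file, written in 1 MiB chunks.
big = os.path.join(src, "big.bin")
chunk = os.urandom(1024 * 1024)
with open(big, "wb") as f:
    for _ in range(1024):
        f.write(chunk)

# The same ~1 GiB as 10,000 files of ~100 KiB each.
small_chunk = os.urandom(100 * 1024)
small = []
for i in range(10_000):
    p = os.path.join(src, f"small_{i}.bin")
    with open(p, "wb") as f:
        f.write(small_chunk)
    small.append(p)

print(f"one large file:     {timed_copy_mbps([big], dst_big):.0f} MB/s")
print(f"10,000 small files: {timed_copy_mbps(small, dst_small):.0f} MB/s")
```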
Compression/Decompression:
There are many factors that come into play here. First, some file formats are handled by software written to use the CPU alone, so a better GPU has no bearing on performance. Other times, the compression/decompression is GPU-accelerated and will run faster with a better GPU.
Slower RAM may hinder these processes too, although in my opinion RAM timings and speeds are not as significant as CPU or GPU capability in most circumstances.
Finally, the speed of the hard drive or solid state drive may SLIGHTLY change the total encode/decode time, but writing the result out is the last step of the whole process (copying/storing the info). The weakest link will always be the slowest component in the chain, of course.
A file could be compressed through RAM at super-speed, matched by the pace of the CPU and GPU, converting data at 25,000GB/s (made-up scenario), and then the output gets written to the NVMe drive at... 400KB/s, because it's hundreds of 300KB files. Well... there's the bottleneck.
I guess what I'm trying to get at is that there are many potential bottlenecks (let's not even bring up L1/L3 cache and other potential chokepoints) that may slow down compressing and/or decompressing files. It may be limited just by the file type, or size, you're compressing, all the way down the chain from hardware to the software itself. Breaking it down in a simple way is impossible.
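One way to see the CPU-bound part in isolation is to time a compressor against data that's already in RAM, so the drive never enters the chain at all. A minimal single-threaded zlib sketch (real archivers like 7-Zip use different algorithms and multiple threads, so treat this as illustrative only):

```python
import os
import time
import zlib

# Two in-memory payloads: one incompressible, one highly repetitive.
random_data = os.urandom(32 * 1024 * 1024)                               # 32 MiB
repetitive = b"the quick brown fox jumps over the lazy dog " * 700_000  # ~30 MiB

for label, payload in (("random", random_data), ("repetitive", repetitive)):
    for level in (1, 6, 9):
        start = time.perf_counter()
        out = zlib.compress(payload, level)
        secs = time.perf_counter() - start
        mbps = len(payload) / secs / 1_000_000
        print(f"{label:10s} level {level}: {mbps:6.0f} MB/s in, "
              f"ratio {len(out) / len(payload):.2f}")
# No disk I/O happens here at all, yet throughput swings hugely with
# compression level and data type: that variation is pure CPU cost.
```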
This is also, I assume, why most PC reviewers don't focus on this topic alone. There are too many variables to make realistic comparisons and recommendations.
Small file transfers - Depends on how often you do it, and if you're willing to do other things while it happens.
Compression or Decompression - tons of other factors come into play, so it's tough to make recommendations (or even to test differences, because there are countless variations from one computer to the next based on how it's built).
Sorry for the long rant. This is a very difficult topic to summarize in the simplest way.
BrokeAudiophileMan
100+ Head-Fier
I got a 4070ti, and I don't get double gain from frame gen, but about a quarter to a third gain. Still, I don't like frame gen because it looks unstable. There are points in time where I see massive delays before frame gen kicks in, or there are glitches.

I have a 4070Ti too. Paid $1100 Canadian for it, for a 70Ti. Should have been like $500 tops if we're talking 2016 reality, and if we go back to the good old days of the 2000s it would have been like $250. I remember my friend got a 560Ti in 2011 for $199 and that same year I got a 580 for like $400... and in 1998 I got a Voodoo Banshee, literally the second best gaming card available, for UNDER $150!!!!
We get screwed so hard.
The 4XXX series cards are all overpriced. The arguable exception is the RTX 4090, because it is a complete BEAST in performance compared to any card on the market. Thus, the inflated price is almost justified (especially if your goal is 4K 120Hz gaming, or GPU-based number crunching such as video rendering).
The 4080, 4070 Ti, 4070, and 4060/4060 Ti are an insult in price vs performance. Nvidia obviously downgraded and tuned these cards to match the 3XXX series counterparts almost exactly, and bumped up the price because they were capitalizing on the scarcity boom of expensive GPUs over the last couple years and screwed over consumers for profit.
They will say "Look at how much better this is because of DLSS 3.0", but if you compare these GPUs on any other level, it's obvious the new chipset can perform MUCH better at lower power draw, yet they gimped the cards to rip off buyers under the guise of being "better" because they use the new DLSS format.
What's worse is the lack of VRAM; these days a GPU should have 12GB or more. With these cards having slower VRAM on a 128-bit bus, they would have to be 8GB or 16GB. A 192-bit bus would have allowed the 12GB "sweet spot" (the quick math is sketched below). What's sad is that these VRAM parameters are a downgrade from even the 3XXX series, meaning they cut even MORE corners to make cheaper cards, and priced them up at release.
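The quick math, for anyone curious: GDDR6/6X chips are 32 bits wide, and the common density today is 16Gb (2GB) per chip, so capacity falls straight out of bus width. A tiny sketch, assuming uniform 2GB chips, with "clamshell" meaning two chips sharing one 32-bit channel:

```python
def vram_gb(bus_width_bits: int, chip_gb: int = 2, chips_per_channel: int = 1) -> int:
    """VRAM capacity implied by bus width, assuming uniform 32-bit-wide chips."""
    channels = bus_width_bits // 32  # one 32-bit channel per chip (or chip pair)
    return channels * chips_per_channel * chip_gb

for bus in (128, 192, 256):
    print(f"{bus}-bit bus -> {vram_gb(bus)} GB, "
          f"or {vram_gb(bus, chips_per_channel=2)} GB clamshell")
# 128-bit -> 8 or 16 GB; 192-bit -> 12 or 24 GB; 256-bit -> 16 or 32 GB
```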
The only good thing to come of this is that people apparently aren't buying these new cards as much as Nvidia was hoping. I think a big part is that many people got a new video card in 2020-2022 and so have no incentive to upgrade to the newer generation. Also, given the difference (or lack thereof) in performance, many have probably just bought a used previous-gen GPU for up to half the price, with the same performance.
This is Nvidia (less so AMD, though they're not completely free of guilt) trying to keep profit margins as high as they were during the pandemic stock shortage (which drove up what many buyers would pay), selling a cheaper-to-make product at higher prices to maintain that level of income for investors.
Starting the next-gen line with the 4090 was a good plan. That card is insanely capable (even funnier was pawning off the RTX 3090 Ti at $2000 right before the launch, ripping off early buyers; two months after the 4090 was released, those 3090 Tis were going for $1100 or less lol), which gave them the opportunity to set their own price point.
The cards after that were overpriced, though, and thankfully AMD was there to try to be competitive and take customers from Nvidia's greedy lineup. But even the AMD cards are not a good bargain: the 7800XT delivers the same performance as the 6800XT, sometimes less, while the 6800XT is cheaper where available.
This is one of the most shamelessly exploitative abuses of influence in the entire history of GPU sales. What makes it hurt more is that we're barely removed from one of the most amazing GPU eras: the GTX 980, GTX 980 Ti, then the GTX 1080 (and the 10XX series in general). MASSIVE performance for the price at the time.
Now the generational leap from the 3060 to the 4060 is -1% to 4%, with a 20-30% price jump. Wow. This is the first time that's happened.
Previously, the new generation Nvidia XX60 card would be as good as the previous gen XX70/70 Ti release. But cheaper.
/endrant.
I've got 30 series cards that paid for themselves, no intention to upgrade anytime soon. Pricing is indeed terrible, but even worse is that inferior connector they went with.
SilverEars
Headphoneus Supremus
- Joined
- Sep 18, 2013
- Posts
- 15,825
- Likes
- 7,327
One of the worst PC purchase decisions of my life. I was happy with my last MSI prebuilt, and then bought the 12th gen Aegis. Look how badly designed the front panel is for airflow. WTF. I will never buy a prebuilt again. The BIOS came gimped when I first got it and I couldn't apply XMP. I just don't like building PCs and troubleshooting, but seeing what I ended up with is motivating me to build my own.
Look at the pitiful airflow intake on the front!
Look at Gamers Nexus making fun of it... It's the reason my CPU goes up to 90C when I tax it with a CPU-intensive game, and why the NVMe drive near the GPU/CPU sits at 50C idle.
He took the front panel off; I guess I can do that to get real airflow...
For me, the difference is noticeable. Not huge but noticeable.
Samsung 980 Pro NVMe SSD (PCIe 4.0)
Sabrent Rocket NVMe SSD (PCIe 3.0)
SilverEars
Headphoneus Supremus
- Joined
- Sep 18, 2013
- Posts
- 15,825
- Likes
- 7,327
What is it that you use it for where the difference is noticeable? I only game and use office software, so I don't notice a difference. I just want fast file transfer, whether it's small or large files. I don't do any video editing or anything that intensive. My CPU/GPU combo is for gaming only.
Well, to be honest, chip manufacturers squeeze out every ounce of headroom they can now and are happy letting chips run at 90C. A shame; I prefer a cooler and quieter running rig.
SilverEars
Headphoneus Supremus
- Joined
- Sep 18, 2013
- Posts
- 15,825
- Likes
- 7,327
The issue is throttling from inadequate cooling. When you tax the CPU with bad cooling, the temperature hits the throttle threshold and you get lower performance. That's why cooling is important for getting every drop of performance out of a setup. Laptop performance is generally throttled due to the difficult cooling situation.
NAND likes heat, but NAND controllers do not. I guess certain portions of the silicon prefer a certain temperature range to move electrons better.
I see your point regarding more and more devices eating too much power and needing better cooling.
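If anyone wants to check whether their chip is actually hitting the throttle point, a rough watch-loop with the third-party psutil package works while you run a heavy load. Note sensors_temperatures() only exists on Linux/FreeBSD builds of psutil, and "coretemp" is just the usual Intel driver name, so adjust for your system:

```python
# pip install psutil  (third-party package)
import time
import psutil

for _ in range(30):  # watch for about a minute while a heavy load runs
    freq = psutil.cpu_freq()  # current/min/max clocks in MHz
    if freq is None:          # not reported on some platforms
        print("CPU frequency not available on this system")
        break
    # sensors_temperatures() only exists on Linux/FreeBSD builds of psutil.
    temps = getattr(psutil, "sensors_temperatures", dict)()
    core = (temps.get("coretemp") or [None])[0]  # driver name varies by system
    pct = 100 * freq.current / freq.max if freq.max else 0
    temp = f"{core.current:.0f}C" if core else "n/a"
    print(f"clock {freq.current:.0f} MHz ({pct:.0f}% of max), temp {temp}")
    time.sleep(2)
# Clocks sagging well below max while the temp pins at its limit = throttling.
```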
chesebert
18 Years An Extra-Hardcore Head-Fi'er
- Joined
- May 17, 2004
- Posts
- 9,354
- Likes
- 4,736
You could just buy a $100 case with some fans and migrate everything over, you know...
That was a general statement about trends in CPU releases, and it's entirely true. Heck, the 7800X3D has a cooler recommendation of a 280mm AIO and still runs 80C with fans at full blast on an open bench in TH's review. When the headroom is present, modern chips often take it, boosting higher, and they throttle when cooling can't keep up. I used to overclock my CPUs/GPUs to their sweet spots; now I routinely undervolt to keep my PC's noise to a minimum without sacrificing much (if any) performance.