Wasim Salman writes about videogames using short, mechanical sentences. He also sometimes builds PCs, and has written this article for us about the ups and downs of building and owning computers that are more powerful than they need to be.
It all started with a 2007 Lenovo ThinkPad T61.
The last PC game I played with any seriousness was Diablo II years earlier.
I ran it on a stock Compaq Presario purchased at a wholesale retailer.
It was garbage, but it ran the only PC game I cared about then.
The T61 was exciting.
It was the first laptop I owned with a 512 MB Nvidia card.
I took it everywhere. I studied it.
I optimized the OS. I maxed out the RAM. I learned about overclocking.
Nvidia prematurely dropped support for the T61’s GPU.
They had already moved on.
I discovered hacked drivers.
I pushed the laptop as hard as I knew how.
The T61 became my media hub.
I began watching anime again.
I started with the future/cyberpunk/mech genre.
All fueled my visions of uncompromised technological ascension.
In 2009, I bought an Alienware desktop.
The case was beautiful. The internal cord management was elegant.
Crysis: Warhead was the first game I played.
It was stunning.
I installed every game I could.
I hit the performance barrier of the 275 very early.
One year later, Nvidia would release the GTX 500 line.
I upgraded to the 1.5 GB 580. I added 2 GB of RAM. I went from a 720p display to full 1080p.
I stuck with this setup for another year.
I was closer to realizing the full promise of my mania.
The announcement of Battlefield 3 was a phenomenon in and of itself.
Alone, it reportedly drove more than a billion dollars into the PC hardware sector.
Everyone was getting ready for Frostbite 2.
I decided I had to start from scratch.
I sold off my Alienware piece by piece.
It was a hard goodbye.
I was going to build the most excessive, powerful, absurd display of technology I could afford.
Months later, my monument stood:
- 3x EVGA 3 GB Nvidia GTX 580s in 3-way SLI.
- 16 GB G.Skill Ripjaw RAM.
- Gigabyte Z68X motherboard.
- Corsair Professional Series AX 1200 PSU.
- Intel i7 3.7 GHz.
- Thermaltake Level 10 GT w/ built-in water cooling.
- 3 TB 7200 RPM Seagate Barracuda hard drive.
- LG Blu-Ray RW drive.
- Cyborg RAT7 Albino gaming mouse.
- Cyborg V.5 keyboard.
- Cyborg FLY5 flight stick.
- Logitech G35 7.1 Surround Sound headset.
- 3x 1080p Planar 3D Vision Ready 23″ monitors.
- Nvidia 3D Vision glasses.
I was proud.
I was ignorant.
Battlefield 3 was gorgeous. It shined.
I had birthed my final daydream.
I had built a useless tyrant.
I looked forward to gaming in 5760 x 1080.
Few games worked.
They displayed the problem in one of three ways:
- Call of Duty and Mass Effect had a ruined FOV: the camera zoomed in so far the games became unplayable.
- Diablo and EVE: the wide resolution created errors with hit detection and HUD display.
- Hawken and Tribes would just default to 1920 x 1080.
In the few years since, none of this has changed.
Programs like Widescreen Fixer help, but multi-monitor support is marginal at best.
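The zoomed-in FOV failure comes down to simple trigonometry. A rough sketch (my own illustration, not code from any of these games): a "Vert-" game locks the horizontal FOV and crops the vertical view as the screen widens, which creates the zoomed-in look, while a "Hor+" game locks the vertical FOV and widens the horizontal view instead.

```python
import math

def vert_minus_vfov(hfov_deg, aspect):
    """Vertical FOV when a game locks horizontal FOV (Vert- scaling)."""
    return math.degrees(2 * math.atan(math.tan(math.radians(hfov_deg) / 2) / aspect))

def hor_plus_hfov(vfov_deg, aspect):
    """Horizontal FOV when a game locks vertical FOV (Hor+ scaling)."""
    return math.degrees(2 * math.atan(math.tan(math.radians(vfov_deg) / 2) * aspect))

single = 16 / 9    # one 1920 x 1080 monitor
triple = 48 / 9    # three 1920 x 1080 monitors side by side

# A game designed for 90 degrees horizontal at 16:9 implies this vertical FOV:
vfov = vert_minus_vfov(90, single)              # ~58.7 degrees

# Broken (Vert-) games keep 90 degrees horizontal and crop vertically:
print(round(vert_minus_vfov(90, triple), 1))    # ~21.2 degrees: the zoomed-in look

# Fixed (Hor+) games keep the vertical FOV and widen the view:
print(round(hor_plus_hfov(vfov, triple), 1))    # ~143.1 degrees
```

This is essentially what tools like Widescreen Fixer patch in: forcing Hor+ behavior on games that ship Vert-.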
Now, I game on one monitor and switch on all three when I am writing in order to multitask.
I don’t have the time or energy to shoehorn games anymore.
3-way SLI was never a problem until this year.
Since building this machine, few major games had issues with SLI.
Fall 2014 was a step back.
Call of Duty: Advanced Warfare released without an SLI profile.
For the first week, launching the game would cause an APPCRASH error.
Nvidia would patch in SLI support a week or two later.
I download Nvidia Inspector and toy with the profile.
I get CoD: AW to utilize 2 GPUs and I leave it at that.
Civilization: Beyond Earth suffers launch errors.
I go into Nvidia Inspector again. I tweak elements of the game’s profile.
Dragon Age: Inquisition is the first major PC game I play that works on release.
But running DA:I with SLI creates strange graphical errors.
The rapid, flickering boxes of land texture give me headaches.
Solid white clouds of fog make parts of the game nearly unplayable.
I dig into the forums to look for workarounds.
I manage to limit the flickering.
The cutscenes grate my nerves.
The fog remains.
Battlefield 4 performance drops and I am confused.
I know I meet the requirements, even at 5760 x 1080.
Frostbite 3 is wonderful, but not the vertical leap Frostbite 2 was.
I check my GPU temps.
I check my CPU and the temperature is off.
I grab my flashlight and shine it at the front of the case.
There is no liquid flowing through the reservoir.
Coolant is dripping down the front.
I do a hard shutdown.
There is a large leak from one of the plastic seams of the flow meter.
I spend the next few days trying to repair it. Nothing works.
I spend another few days removing the entire cooling system.
I order a massive Noctua CPU Cooler. I am anxious about installing it: I don’t know if I remember how to take my computer apart.
It takes time, but I gradually recall what to do.
I secure the cooler and rebuild.
I remember the thrill when it all works.
The Noctua performs better than the stock cooling system.
I throw the old cooling system in the garbage. I still lament its loss.
Another piece of the dream tempered by time and cleaved by the practical.
I press the power button.
The computer boots up. The computer shuts down.
It cycles every five seconds.
After a few minutes I do a hard shutdown.
I start it up again. It boots to Windows.
An hour later, it shuts off as I’m working.
Over the next few days the problem gets worse.
I know the electricity in my place is questionable sometimes.
I buy a large APC UPS. I plug it in.
Nothing changes. The computer boots up.
The computer shuts down.
I become worried.
I go through every forum I can think of.
Many suggest a problem with the PSU.
I drive to my brother’s house. I borrow his new PSU.
I gut the computer and plug his in.
The computer boots up. It stays on.
I consider this a minor victory. I shut it off.
I plug my PSU back in. The computer boots up.
It stays on.
Midnight on a Sunday and I feel accomplished.
I stand the case upright. The cycle starts again.
I laugh and I feel like I’m losing my mind.
The Noctua catches my eye.
I think about it and go to bed.
I spend the following afternoon taking everything apart.
I reapply the thermal paste to the CPU. I reseat the Noctua. The RAM. The GPUs.
I plug everything back in and test.
It boots up. I let it run. It stays on.
I feel accomplished. I turn it off.
I take it back to my office and plug a monitor in.
I watch it boot up.
A small hourglass symbol now appears in one of the boot screens.
It’s still there today.
Neither I nor this machine can forget what happened.
I looked forward to playing games in 3D.
Nvidia’s 3D Vision system uses active shutter glasses to create detailed and deep immersion.
I was excited at the prospect of games at full resolution and in 3D.
It didn’t work out.
Turning the 3D on affected performance more than I thought it would.
I had to choose between full resolution and 3D.
And for a while I would jump back and forth between the two.
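The choice was forced by rough arithmetic. A back-of-the-envelope sketch (my own numbers, not a benchmark): active-shutter 3D renders every frame twice, once per eye, and triple-wide surround already triples the pixel count over a single monitor.

```python
# Back-of-the-envelope GPU workload, relative to one 1080p monitor in 2D.
# A sketch only: real performance does not scale perfectly with pixel count.
base_pixels = 1920 * 1080        # single 1080p monitor
surround_pixels = 5760 * 1080    # three 1080p monitors side by side
views_3d = 2                     # active-shutter 3D draws each frame once per eye

surround_2d = surround_pixels / base_pixels   # 3.0x the pixels
surround_3d = surround_2d * views_3d          # ~6.0x the work of single-screen 2D
print(surround_2d, surround_3d)
```

Six times the work of a single 2D screen is a lot to ask, even of three 580s.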
Now I don’t bother with either one.
I don’t know where the glasses are anymore.
This monument has been collapsing for a long time now.
Liquid cooling. Multi-monitor. 3D. All cut down.
The excess pruned away.
3-way SLI is the final remnant of my tech-driven abandon and now I have begun considering its removal as well.
GPUs have come a long way since I began idolizing the 580.
And if I am only utilizing one monitor during games, is there a point to having three aging cards?
Nvidia is now at the GTX 900 line and I can’t afford the upgrade cost to maintain this setup.
I am considering dropping the 580s and the monitors for one powerful card and one beautiful monitor.
A part of me hurts at the thought of this.
It is painful to remember the overwhelming excitement I had for this computer.
It is difficult to watch my own expectation wither away.
Time tempers everything: Dream. Passion. Ambition.
Time wears you down with its silence and failures and the fever dreams of a younger age no longer seem as important, as necessary, or as loud.
At some point you have to gut everything to move forward.
At some point you have to let everything fall.
At some point you have to trample towards simplicity.