
Ryan Harrison - My blog, portfolio and technology related ramblings

Programs to install on a New Build

Below is a list of all the software I tend to install straight away on a new build (or simply when reinstalling Windows from time to time). I am big on keeping what I have installed at any point to an absolute minimum - mainly to prevent general slowdown over time - so this list isn't that long. It covers pretty much anything I need to get done, even if extra utilities are needed later on.

Browsers: the core of your computer these days

Chrome - my main browser and has been for quite a while now. Sure it’s a massive resource hog, but what’s the point in having RAM if it’s sitting idle? Still probably the fastest browser around and the most popular.

Firefox - mainly installed as a backup which gets used every so often. The new Firefox Quantum update has improved the situation dramatically and maybe I’ll try it as my main driver if Google screws things up.

Browser Extensions: pretty much mandatory if you want any kind of sane browsing experience

uBlock Origin - ad/tracker blocker. A must have (or alternative). The web sucks these days without it. My soul dies a little inside every time I have to use a browser without some kind of adblocking - we’ve really screwed up the internet with the mountain of Javascript, popups and auto-playing videos plaguing every site.

LastPass - if you aren’t using a password manager of some kind, I recommend you revisit that decision. LastPass and their extension have been working great for me.

Google Mail Checker - displays an icon in the toolbar linking directly to your GMail account, also shows the number of unread messages.

JSONView - if you ever look at JSON in Chrome, this is a must to get some nice formatting.

Again, I like to keep this list to a minimum as Chrome starts to slow down and consume even more resources the more extensions you have. If you do need loads, I recommend disabling them until you actually need to use them.

Text Editors: for when you want to edit some text

Notepad++ - small, fast and feature rich replacement to the standard Windows Notepad. Great for any light text editing that doesn’t require a full-blown editor/IDE.

Visual Studio Code - probably the best editor around now after pretty much wiping the floor with Atom and Sublime. The amount of updates each month is insane and the extensions are very mature at this point. See here for the extensions I use.

Dev: tools and IDE of choice

Git - because you wouldn’t version control any differently these days now would you?

JDK - I mainly develop on the JVM (which whatever you think of Java is a great piece of tech). P.S - Kotlin is awesome.

Intellij - one IDE to rule them all. Does everything in every language, what can I say?

WSL (Windows Subsystem for Linux) - an Ubuntu install for Windows for various utils.

Node - because apparently I need some way to install 3 thousand packages for a ‘Hello World’ webapp.

Media: for when you want to not do anything productive

K-Lite Codec Pack (MPC-HC) - can play pretty much anything you can ever come across and bundles in Media Player Classic which is my favourite media player.

Spotify - not much to say, does the job and I haven’t seen any need to try out any other service.

IrfanView - the built-in Windows 10 Photos app is absolutely terrible in every way imaginable.

Networking: connecting to other machines

FileZilla - SFTP client although not really needed anymore as WSL and rsync are a thing. Still small/lightweight enough to keep around.

Postman - great program to create and send HTTP requests. The de facto choice at this point for testing web services.

Private Internet Access - current VPN provider. Never had any problems with it, speeds are good and the client is solid.

PuTTY - still solid as ever even if you can use WSL for ssh now.

Games: launchers for the actual games

Steam - not much more to say about this. If it’s not on Steam I probably don’t want to play it.

Origin - because Battlefield is sadly not on Steam.

Monitoring: because you need to keep an eye on those temps after you overclock

HWMonitor (portable) - simple, lightweight and easy to read measurements across your system.

HWiNFO64 - a more heavyweight alternative to HWMonitor; the number of readings it gives is comprehensive to say the least.

Misc: random tools and utilities

F.lux - remove blue light from your life.

CCleaner - still hanging around, runs every so often to delete temp files.

WinRAR - yes, the interface is outdated, but I only ever use the explorer context menu items. It's been a staple of mine for many years.

Again, this is just the barebones list that I tend to install immediately on a fresh install of Windows. Things tend to accumulate over time, but I still try to keep it to a minimum.


Ubuntu Server Setup Part 2 - Secure Login

Before reading this, make sure to go over part 1 which covers initial login and setting up a new user.

In the previous section we covered logging into the server with the root user. At that point we were using a simple password, which is less than ideal. In this part we will set up public key authentication for the new user in order to better secure our logins. Root login over SSH will also be disabled, forcing you to go through your newly created user and use sudo to get root access.

Generating an RSA public/private keypair

On Windows you can use the free PuTTYgen utility, which is bundled with PuTTY.

Open the app and select SSH-2 RSA under the Key menu. Then hit Generate and provide some mouse movements to generate some randomness.


The top textbox will contain the newly generated public key, which will be deployed onto the remote server. Save both the public and private keys in a safe place. Remember: never give anyone (or anything) your private key.

The utility will save the private key in the .ppk format which PuTTY can understand. You can choose to export into the more generic OpenSSH format if needed (e.g. to use with the ssh command under WSL).

Copy the contents of the top textbox to the clipboard, as this is what will be saved on the remote server in order to authorise you.

If you are using Linux you can use the ssh-keygen command instead to generate the keys.
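As a rough sketch, a minimal ssh-keygen invocation looks like this - the comment and output path are just examples, and in practice you would normally set a passphrase rather than leaving it empty:

```shell
# Generate a 4096-bit RSA keypair; -f sets the output path, -C a comment.
# -N "" skips the passphrase purely to keep the example non-interactive -
# a real key should have a passphrase.
ssh-keygen -t rsa -b 4096 -C "me@laptop" -f ~/.ssh/server_key -N ""

# The public half to deploy on the server is then in:
cat ~/.ssh/server_key.pub
```

The contents of the .pub file are what gets pasted into authorized_keys on the server.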

Installing the public key

Login to the remote server under the new user you wish to secure (currently using a password although we will now change that).

If you are still logged in as root, run su - <user>

In the home directory create a new .ssh directory which will house the public key.

$ mkdir ~/.ssh

Change the permissions to ensure that only the user can read or write to the directory.

$ chmod 700 ~/.ssh

Create a new file called authorized_keys and open it using the nano editor:

$ nano .ssh/authorized_keys

Paste your public key into this file, then press Ctrl+X followed by Y to save and exit.

Change the permissions on the new key file so again only the current user can read or write to it.

$ chmod 600 ~/.ssh/authorized_keys
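A quick way to sanity-check the permissions (sshd can silently ignore the key if the directory or file is readable by others):

```shell
# Should print 700 for the directory and 600 for the key file.
stat -c '%a %n' ~/.ssh ~/.ssh/authorized_keys
```

Alternatively, the ssh-copy-id utility bundled with OpenSSH automates all of the above steps in one command, though doing it by hand at least once is worth it to understand what's going on.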

Login using public key authentication

Now that the public key is installed on the server and you have the corresponding private key on your local machine, it's time to login using them. In PuTTY, go to the Connection -> SSH -> Auth tab and browse to the .ppk private key in the bottom field:


If you're using the ssh command, place the private key at ~/.ssh/id_rsa and it will be used automatically. Otherwise you can pass the path to the private key as you login:

$ ssh -i ~/.ssh/private_key user@server_ip
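To avoid passing the key path on every login, you can also add a host entry to ~/.ssh/config - the alias, address and username below are placeholders:

```
Host myserver
    HostName <server ip or domain>
    User <user>
    IdentityFile ~/.ssh/private_key
```

After which ssh myserver will connect using the right key.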

Disable root login

In order to further secure the server, it’s best to prevent direct login to the root user. I have also changed the port for ssh to something other than 22 to prevent a lot of automated attacks and disabled password authentication (forcing you to use public keys).

$ sudo nano /etc/ssh/sshd_config

PermitRootLogin no
Port 23401
PasswordAuthentication no
AllowUsers Fred Wilma

Validate the edited config first with sudo sshd -t (a syntax error here could lock you out), then reload the ssh daemon to apply the changes. It's also wise to keep your current session open and confirm a new login works before disconnecting.

$ sudo systemctl reload sshd

With these settings active, you will be forced into logging in via the Fred or Wilma users (root being disabled) by public key authentication on port 23401.


Helpful Extensions for Visual Studio Code


Icons or Material Icons

Much needed icons for pretty much every common folder/file combination you can imagine.

File Utils

A convenient way of creating, duplicating, moving, renaming and deleting files and directories. Similar to the Sidebar Enhancement extension in Sublime Text. Again, this is something I see no reason couldn't be integrated directly into VSCode.

Code Runner

Run code snippets or code files for many languages directly from the editor. Run the selected code snippet/file or provide a custom command as needed. Kind of surprised that VSCode doesn't have this built in.



Python

Rich support for the Python language (including Python 3.6), with features such as linting, debugging, IntelliSense, code navigation, code formatting, refactoring, unit tests and snippets. Definitely a must have if you do any Python development at all in VSCode.

React Code Snippets

This extension contains code snippets for Reactjs and is based on the babel-sublime-snippets package. Pretty much a must have if you do any React development and use snippets.

ES6 Code Snippets

This extension contains code snippets for JavaScript in ES6 syntax for VS Code editor (supports both JavaScript and TypeScript). Very useful for class definitions, import, exports etc.



ESLint

Integrates ESLint into VS Code. It can be very picky at times and suggests issues that I sometimes don't care about, but you can get it to a decent place after some customisation.

The extension uses the ESLint library installed in the opened workspace folder. If the folder doesn’t provide one the extension looks for a global install version (npm install -g eslint for a global install).
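As an illustration, that customisation lives in an .eslintrc.json file at the workspace root - the rules below are just examples of the kind of tweaks I mean, not a recommended set:

```json
{
  "extends": "eslint:recommended",
  "env": { "browser": true, "es6": true },
  "rules": {
    "no-console": "off",
    "no-unused-vars": "warn"
  }
}
```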

Markdown Lint

Provides linting for the Markdown language. Includes a library of rules to encourage standards and consistency for Markdown files. It is powered by markdownlint for Node.js which is based on markdownlint for Ruby.

Code Spell Checker

A basic spell checker that works well with camelCase code. The goal of this spell checker is to help with catching common spelling errors while keeping the number of false positives low.

I only use this for Markdown files as a spell checker and it does an ok job. It's probably the best extension that provides this functionality, but it's still fairly limited. I wish the dev team would integrate this feature natively. You have to click on the quick fix menu (lightbulb icon) to see spelling suggestions, as opposed to right-clicking on the word as you would expect. I guess this is a limitation of the extension framework, so there's definitely some room for improvement.


Auto Close Tag

Automatically add HTML/XML close tags. Same as how Visual Studio or Sublime Text do it so very useful if you’re used to that behaviour already.

Path Intellisense

Extension that autocompletes filenames from the local workspace. E.g. typing ./ will suggest all files in the current folder. Very handy.

I'm no doubt missing a bunch of other great extensions, but I try to limit the number to keep things as responsive as possible. Visual Studio Code is already a resource hog (pointing at you, Electron) without a bunch of background addons making the problem worse.


A Better Alternative to Google Authenticator

2-Factor authentication is, for very good reasons, becoming increasingly popular as a way to further protect yourself online. The sole use of passwords has long been inadequate for secure authentication and so has been augmented by additional systems. A lot of online services provide SMS messages as a main method for 2-factor authentication, whereby a code will be sent to your phone. This solves part of the problem, but is still susceptible to the inherent insecurity of SMS as a whole, let alone SIM cloning and number spoofing issues.

As a better alternative, many providers have been offering the use of TOTP (Time-based One Time Passwords) to generate such codes. The protocol behind this is open, however the most popular implementation is by far the Google Authenticator app, which allows you to scan QR codes to add accounts and will constantly generate one-time-use codes as needed. Its popularity has also meant that most online services directly link to the app and include it in their usage instructions for 2FA auth.

Google Authenticator app

The Problem

The Google Authenticator app is all well and good, works well and is very easy to use. It does however open up another problem - what do you do when you lose your phone? It’s pretty plausible that for a significant number of users, their phone will either be lost, broken or stolen whilst they are using it to generate 2FA codes. What can you do when you can no longer login to many of your accounts because you aren’t able to generate the TOTP?

Many websites will also give you another security code when you enable 2-factor authentication, which you can use in this exact case. But isn't that kind of defeating the whole point? Where are people going to store this code? You're pretty screwed if you lose this recovery code, so you might end up writing it down somewhere insecure or storing it online somewhere equally insecure. In my opinion, this is solving a problem by creating a new one.

And that's only taking into account those sites which do offer you a recovery code. For the no doubt significant number which do not, you are locked out of your account if you lose your phone. It's going to be on a case-by-case basis whether some providers let you back in if you contact them, but I'm not sure how they are going to know it's you. For any site that stores sensitive data, I don't see this as an option.

A Solution - Authy

Maybe a lot of users will be put off enabling 2FA for this reason, or more likely a lot of people have never really thought about the potential consequences. Either way, just like your main data, you need to also have a solid backup solution for your 2FA codes.

I mentioned before that the TOTP protocol is not proprietary - so it can be implemented by anyone. I think many assume that this technology is something Google has magicked up, but in reality there are a number of alternative apps out there.
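To show just how open the protocol is, here is a rough TOTP sketch (RFC 6238: HMAC-SHA1 over the current 30-second window, truncated to 6 digits) using nothing but bash, xxd and openssl. The key below is the test key from the RFC, not a real secret - real apps receive a base32-encoded secret in the QR code and decode it first:

```shell
# TOTP = HOTP(secret, unix_time / 30), per RFC 6238 / RFC 4226.
key_hex=3132333435363738393031323334353637383930   # RFC 6238 test key
step=$(( $(date +%s) / 30 ))                       # current 30-second window
msg=$(printf '%016x' "$step")                      # window as 8 big-endian bytes
hmac=$(printf '%s' "$msg" | xxd -r -p |
       openssl dgst -sha1 -mac HMAC -macopt "hexkey:$key_hex" |
       awk '{print $NF}')
offset=$(( 0x${hmac:39:1} ))                       # dynamic truncation offset
start=$(( offset * 2 ))
code=$(( (0x${hmac:start:8} & 0x7fffffff) % 1000000 ))
printf '%06d\n' "$code"                            # the 6-digit one-time code
```

This is essentially all that Google Authenticator, Authy and the rest are doing every 30 seconds - there is no magic involved.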

One such app is called Authy, which aims to solve the problem mentioned above. In the basic sense, it is very similar to Google Authenticator, whereby you scan the same QR codes and it generates TOTP codes for you. The difference however is that it provides a method of automatic backup of your accounts. In a similar manner to conventional password managers, such as LastPass which you should definitely be using, Authy will encrypt and upload your account strings to their servers when you add them to the app. This is tied to a password you specify, which they never know - so if you trust password managers then this should be no different.

Your account itself is tied to your phone number, so when you lose your physical device, you can recover all your accounts as long as you move over your number. There are also features which allow sharing of your accounts with your other devices in a similar manner.

Authy app

Yes, I know you can just screenshot the QR codes which are generated, or add them to your other devices at the same time, but this is putting all the pressure of the backup on the user. Where are you meant to store the QR codes (how do you back up the backup?), will you encrypt them, how are you going to keep them in sync, etc.? Again, in this case you are solving a problem by creating another one - for yourself.

It’s not perfect

The app isn't perfect. For such a simple set of use cases, I have no idea why the app misses some key features that would make it more user friendly (and more approachable than the Google offering).

  • You can tie your accounts to a predefined set of providers that the Authy developers maintain (e.g. Facebook, Google, Amazon etc). By doing so you get a nice looking logo and some customised colours for your troubles. This does make the app look a lot nicer, but you rely on the site being in the set that the developers give you. Why the hell can I not provide my own logo? Why the hell can other users not upload their own customisations? Why the hell isn't the existing set bigger? I mean seriously, the look and feel of the app is one of the main selling points given by the devs themselves; this should be so easy to add and would contribute to one of your main features. The Google Authenticator app does look bland in comparison - but only when I don't have to use the crappy 'other account' template.
  • You can rename your accounts to what you like, but this name doesn’t seem to be used when you choose the grid view. Why? Do you think I changed the name just for fun? If I changed it then it’s because I want to see it. The changed names are even used in the list view!
  • The QR scanner isn’t great. I mean, it’s definitely functional for sure, but it’s nowhere near as good as the one used in the Google Authenticator app. You have to really line up the code in the camera and get it into focus for it to work. In the Google app I can just point it somewhere close and it picks it up immediately.

For sure I am nitpicking with these annoyances, but if you want to draw people away from an app provided by Google, then you're going to have to get it completely right. Hopefully the devs can get on top of this, because for me the main selling point - automated backups - works very well. For most users I would still definitely recommend the Authy app (or others which offer similar features) over the Google Authenticator app.


New PC Build

Late last year I finally got around to buying and building my new computer after many months of research, waiting for releases and price monitoring on PCPartPicker. It was definitely massively overdue, as I was running an AMD Phenom II X4 955 (3.2ghz) on an AM3 board for the preceding 7 years! Its age was definitely beginning to show, whereby new AAA games would be massively CPU bottlenecked and Battlefield 1 would hardly run at all due to missing some modern instructions. Not to mention how a YouTube video running in the background would intermittently freeze when doing basic work in IntelliJ.

My overall plan and thought process was to go all out on the new parts - which should hopefully last easily for the next few years. I could also make use of one of the best things about building your own desktop computer - reusing old parts to save money. What follows is an explanation of each of the parts I chose and how they have been performing ~1 month after the build:

CPU - Intel Core i7 8700k @ 4.8ghz

Intel Core i7 8700k

Starting with perhaps the main part which ultimately determines the platform you will need - this one changed significantly over the course of last year. With the very successful launch of AMD's Ryzen series of processors, I was initially planning on getting a Ryzen 1700 (8 cores/16 threads), as for raw price/performance you couldn't (and still can't really) do much better. The launch didn't go without a few hiccups, mainly around memory compatibility on a new platform, but it seems to have gotten a lot better. With a very significant ~50%+ improvement to IPC, AMD are finally able to compete with Intel again in the CPU space. The problem however is that the clock speeds are still very limited compared to their Intel counterparts. They provide a staggering number of cores for the price point, but the vast majority of users are never going to utilise them all unless you are a hardcore video editor etc. For me personally, having 16 threads isn't all that important when compared to clock speed - which undeniably still causes the biggest performance difference in today's (mainly single threaded) applications. The low powered 14nm process used in the Ryzen processors can barely break 4ghz even with an overclock, although hopefully this will improve with the next generation on the 12nm high powered process.

Regardless of the low clocks, I was still planning on a Ryzen build until Intel, evidently feeling threatened by AMD, significantly brought forward the release of their 8th generation Coffee Lake CPUs. The rumours were that they were adding cores whilst maintaining their high clock speeds, so I decided to wait until they were (paper) released and see how they performed in the reviews. And man am I glad that I did, because the high end parts in particular destroy Ryzen in most workloads. I was always focused on the 8700k, which has 12 threads (more than enough for me) and runs at 3.7ghz stock but turbos all the way up to 4.7ghz on one core. According to Reddit, most Ryzen 1700/1700x chips can overclock to around 3.8ghz max. That ~1ghz+ delta in clock speeds is very substantial and means the 8700k can still keep up in multithreaded workloads even when it lacks 2 full cores.

The release of Coffee Lake wasn't without its problems either though. Global stock of the 8700k in particular was extremely short, probably because Intel hadn't had enough time to manufacture them after bringing forward the release date. As such, prices were massively inflated initially due to poor supply. After waiting out the initial rush I did manage to get my unit for a reasonable price - even if I did have to order it from the Czech Republic. I paid more for it, even compared to current pricing, but I still think it was worth it and I was pretty desperate to upgrade last year!

The only thing I can really say about the 8700k is that it's a complete beast. It doesn't take a lot to be a significant improvement over my last system, but the 8700k chews through any workload I can throw at it without breaking a sweat. Games are completely GPU bottlenecked again (as they should be) and overall performance is excellent. I've currently dialled in a 4.8ghz overclock across all cores, which is pretty mad really. I think I also got a golden chip, because a 5ghz overclock was also possible at reasonable voltages/temperatures. I'm still playing around with the overclock though, so more to come on that front. If I can get a good 5ghz overclock, that's a mad amount of performance on a 6 core chip. We will see what happens with the whole Meltdown and Spectre thing, which looks like it might impact performance by a couple of percent, but overall I definitely recommend the 8700k. Hopefully AMD can once again catch up with Pinnacle Ridge and then Zen 2, which should promise much higher clocks. For the meantime though, the performance crown still belongs to Intel.

CPU Cooler - Noctua NH-D15S

Noctua NH-D15S

The 8700k runs hot and that isn't an overstatement. There has been a fair amount of controversy online about the bad TIM (Thermal Interface Material) that Intel uses to join the CPU die to the heatspreader, and there are also mentions of air gaps between the two causing issues. Why they chose not to solder like AMD have with Ryzen I don't know (although I'm sure there are reasons beyond just cost saving), but the result is a chip that requires top end cooling to keep under control - especially if you also want to overclock, and as it's a K-series unlocked chip you should want to (P.S. the i7 8700 is pretty great price/performance if you don't want to overclock).

Most people with the 8700k are using all in one liquid CPU coolers or even custom loops, but using water in a computer still seems strange to me and I don’t particularly like the idea of the pump suddenly dying and the increased maintenance required. Luckily however, there are now air coolers available which, although might look worse, offer similar performance to water cooling whilst being cheaper and very quiet.

It didn't take much research to find that Noctua is the clear winner in this department. Their coolers are very well manufactured, perform brilliantly and use their own fans, which are already some of the best on the market. Put all that together and you get something that can easily tame even the 8700k. I eventually chose the NH-D15S dual tower cooler over the very similar NH-D15; although it only includes one fan, it still performs very similarly and is slightly smaller.

The Noctua coolers aren’t cheap by any means, but I think it’s definitely worth the price. The packaging is great and their unique mounting system is probably the best out of any manufacturer. At idle, my 8700k barely breaks 30C (it downclocks to 800mhz) and at full load doesn’t go much above 80C even when running Prime95 (which is the worst case scenario that won’t be met in every day use). The best thing however, is just how quiet it is. At idle I can’t really hear it at all and even at full load it remains surprisingly quiet considering how much heat it manages to dissipate. Again, I would definitely recommend these Noctua coolers, just make sure you have enough room in your case to accommodate them.

Motherboard - Gigabyte Z370 Gaming 5

Gigabyte Z370 Gaming 5

The Z370 chipset is the flavour of choice for Coffee Lake at the moment pending the lower end chipset releases early this year (although for an 8700k you really want a pretty high end Z370). I ended up with the Gaming 5 as it had some good reviews and has a well rounded feature set for the price point. I also got £20 worth of Steam vouchers through a promotion offered by Gigabyte and am about to get another £20 through another promotion to leave a review. In real terms that makes this board excellent value for money.

It's a very solid board and I have no complaints so far after a month of good use. The VRMs are some of the best in this price range and easily support my 8700k running at 4.8ghz (and even at 5ghz) with good temperatures - something which cannot be said of some other cheaper Z370 motherboards. No issues setting up an NVME drive either (which can also be placed above the graphics card, not only below, for better thermals).

Overall connectivity is definitely a strong point, as some competitors seem to be lacking in USB ports on the back panel. The inclusion of a USB type C port on the back plus a header is also nice to have for future-proofing. AC WiFi is also definitely a good selling point (and is interestingly missing from the Gaming 7 model) and works as expected for those of us who are unable to have wired connections.

The BIOS is nothing outstanding, but has all of the settings you could pretty much ever need. The XMP profile on my RAM kit was easy to enable and runs at 3200mhz without issue, and overclocking is also straightforward, with multiple guides available if you need pointers. It's good to see Multi-core enhancement (MCE, which auto overclocks k-series processors to max turbo across all cores) turned off by default, as it should be - something which cannot be said of the Asus boards. Fan control is very easy and the board has plenty of hybrid fan ports - great to see for complex watercooling setups.

Build quality is good and the reinforced PCI-e slots are nice for heavy graphics cards. My favourite feature has to be the ALC1220 audio though which (coming admittedly from poor onboard audio) sounds fantastic in comparison.

I’m not that into the whole RGB lighting game, but this board definitely suits those who are, as there are plenty of lights scattered all over. There are also options for adding additional lighting strips if that’s your thing. Everything can be configured both in the BIOS and in the extra software (including turning it all off if needed) but my case doesn’t have a window so I don’t see it anyway.

Overall at this price point this board is a very solid all rounder and I would recommend it to any prospective Coffee Lake buyers. The more expensive Gaming 7 is also an option, which includes very beefy VRMs and better onboard audio. For me though these features weren't worth the extra money and the loss of WiFi/Bluetooth.

Memory - Corsair Vengeance LPX 16gb @ 3200mhz

Corsair Vengeance LPX 16gb

RAM prices at the moment are crazy. Monitoring the pricing of this kit via PCPartPicker showed multiple price hikes over the course of last year, which now put this kit at over £200. I got it for a bit cheaper than that, but it definitely hurt. Hopefully the situation improves as the new NAND factories open this year (and maybe after some investigation into possible price fixing).

The kit itself is pretty standard and nothing really to write home about. It has a plain black look and wide support across many motherboards. I’m not interested in fancy RGB memory or large heatspreaders, so it fits my build well.

16gb is the sweet spot at the moment, with 32gb being incredibly expensive and unnecessary for most workloads. Meanwhile, 8gb is starting to become too little in some modern games and applications. I don't expect to need additional memory in the near future. The speed however is something I was willing to pay more for. The difference between stock DDR4 2133mhz and 3200mhz can be quite substantial - even more so on Ryzen due to Infinity Fabric, but it also makes a difference on Intel systems. I think 3200mhz is currently the max I would recommend whilst staying reasonably priced and easy to apply as an XMP profile in your motherboard. I had no issues enabling it and run at the rated speed in my system. Moving forward I would definitely stay above 3000mhz for any new builds, and ideally settle at 3200mhz+ for some future-proofing.

Graphics Card - MSI GeForce GTX 960 2G

I’m currently reusing the GPU from my old machine and yes, I know this card is massively underpowered considering I am pairing it with an overclocked 8700k. It definitely starts to struggle a bit in some modern games, but I only play at 1080p 60hz anyway so it does the job for the time being. Nevertheless, I can still maintain reasonable frame rates in most games at high settings. The fact that the fans only start spinning when load is applied also means that the build is virtually silent at idle.

I was planning on upgrading the GPU at the same time (to a 1070, maybe a 1080), but I didn’t see the point as these cards have already been out for multiple years now and at the rate the industry is moving, will be obsolete when the next generation gets released. On that note, I expect Nvidia will be releasing their new Volta (or Ampere?) cards at some point this year, so I will likely upgrade to one of them. Hopefully crypto mining doesn’t inflate the pricing too much. We have already been teased about Volta with the new Titan V, so we should expect a decent performance bump with the new models.

Storage - Samsung 960 EVO 250gb NVME Drive + Crucial 256gb SATA SSD & 3TB Western Digital HDD

Samsung 960 EVO 250gb

I really wanted to get a good M.2 NVME drive for this new build and I settled on the popular Samsung EVO lineup. They are very expensive so I only got the 250gb model, but this drive just holds the OS and applications so it's more than enough. This drive is blisteringly fast. It took barely 1.5 minutes to install Windows, and pretty much everything loads extremely quickly. Boot times are also pretty crazy, even compared to using a SATA based SSD. It's almost definitely overkill for me, but I love it nonetheless and would recommend it if you are a speed enthusiast and have the budget.

In addition to the NVME drive I also took my old SATA SSD and spinning hard drive from my old system. The SSD holds games and the HDD is the main data storage drive. This configuration works very well I think. It isn't unreasonably expensive and still gives you a good overall amount of storage and great speeds. The NAND shortage doesn't seem to have affected the 960 EVO drives too much either, which is good.

Case - Fractal Design Define S

Fractal Design Define S

For some, choosing a case can be one of the trickiest parts. Personally however, I settled on the Define S very early on. In terms of looks it's a very no-frills case (even more so because I chose the windowless model), but the build quality is great, it's easy to build in and best of all it's very cheap for what you get.

The packaging the case came in was good, with very little chance of damage during transit and the included manual is nicely detailed to make building within the case very simple. It’s clear that each section has been thought out well and it definitely shows in the generally excellent reviews it gets. There are a number of similar models by Fractal Design as well including the Define C and variants which include windows.

A couple of things to note: the case does not include space for any 5.25” drives. Initially I thought this was a downside, but really it doesn't matter a lot to me as, to be honest, I can't remember the last time I used the CD/DVD drive in my old machine. The bonus of removing the cage is that the interior of the case is extremely roomy, with plenty of space for large watercooling setups and good ventilation for air coolers. There are plenty of good cable management holes which make a tidy system relatively straightforward. The case comes with two 140mm case fans which are extremely quiet and perform well.

Overall, I think this case definitely lives up to the reviews. It would be nice to have an enclosure around the PSU to hide some of the cables (more of an issue for those with the window), but considering I got it for less than £60, I think it’s great.

Power Supply - EVGA SuperNOVA G2 650W 80+ Gold

EVGA SuperNOVA G2 650W

Nothing too fancy in the power supply department. The EVGA SuperNOVA G2 is an 80+ gold rated unit with great reviews (especially from JonnyGuru) and should be solid for many years to come. Interestingly, there is an updated G3 variant, but it doesn't seem to be particularly popular here in the UK, with many retailers preferring to stock the G2 model. The unit also has a mode whereby the fan will only turn on when needed (similar to most modern GPUs), which makes the system quieter. There are a number of other models for those who need more wattage. I did consider the 750w model, but the 650w is cheaper and should be more than enough for me with a single GPU. Being gold rated also means it should stay efficient even when drawing near the top end of the rated wattage. Remember - never cheap out on your power supply.


Overall, I’m very happy with the build. All in, the parts above (excluding those parts which I am reusing) came to ~£1100 including delivery costs which, when considering the performance it gives, is good value for money. You can easily spend considerably more on a prebuilt with lower quality components and less overall performance.

It still seems like building your own system would be a hard thing to do, but these days the process is rather simple. The hardest part is selecting your components, but there are so many guides and sources of help online that you should end up with compatible components if you have any sense. The actual process of building has become significantly simpler over the years and these days literally just consists of plugging everything together. With the number of YouTube build guides to follow, building your own system should be open to everyone.

As is (excluding a GPU upgrade), this system should remain performant for many years to come. Hopefully this time I will get around to upgrading it before the 7 year mark.

PCPartPicker for this build
