Total Biscuit – Hamilton’s Great Adventure

By Quintin Smith on June 5th, 2011 at 12:16 pm.

Ruined.

WTF is Hamilton’s Great Adventure? I couldn’t tell you, but bubbly UK games commenter Total Biscuit can, and he’s waiting for you just after the jump in this week’s Spotlight On Biscuit. Also, I’m finally giving in to a veritable mosh pit of popular demand and posting a few thoughts on the Custard Cream.

Did you know the Custard Cream has been awarded Biscuit of the Year twelve times? Us Brits, we’re mad for the Custard Cream. Look at these guys. Mad.

Any Americans reading this? You know how you guys have the Oreo, right? I’m gonna lay it out for you right here: the Oreo is a joke. When you have that creamy filling, you don’t want a soft, pathetic, limp-handshake biscuit on either side of it. You want something with crunch to offset the filling, so you get some variety and don’t end up with a big calorie-rich mouthful of goo. That’s what the Custard Cream offers. Crunch. And that’s why it’s the better biscuit.

__________________


95 Comments »

  1. jon_hill987 says:

    The Custard Cream is only below the Dark Chocolate coated HobNob in my opinion. Though I also like the Custard Cream’s chocolatey cousin, the Bourbon.

    • Meat Circus says:

      Custard Creams are wank. BISCUIT FACT.

    • Man Raised by Puffins says:

      I’ve always found that there’s something not quite right with Bourbons, a certain deficiency in the biscuit portions that I can never quite pinpoint. Custard Creams, however, are truly magnificent and sit alongside Milk-chocolate Hobnobs, the Prince biscuit and Oatmeal cookies in the Pantheon of biscuit Gods.

    • mousearmy says:

      Nonono good sir. The chocolatey analog of the magnificent Custard Cream is the rich and decadent Romany Biscuit, not the sickly Bourbon.

      http://www.youtube.com/watch?v=XZSHdQdfOB4

    • ColOfNature says:

      HobNobs are under boycott until we get an abject apology from McVitie’s for ripping off Beardyman. And if you knew how much I like HobNobs you’d know how much this pains me.

    • Berzee says:

      Pssh, eat the biscuits. I don’t think “haha beatbox cooking I made this one video” is a specific enough idea that anyone should pout about it being recycled.

    • DiamondDog says:

      Meat Circus, I’ve read a few of your comments that express an opinion I don’t agree with. You seem to hate the things I love most. I have come to realise that we must be very different people. That’s fine though. We can’t all be the same, right?

      Well today, sir, you have gone too far. How dare you speak in such a way about a fine, fine biscuit.

    • amandachen says:

      No love for Ginger Nuts?

    • MiniMatt says:

      Yes, the ginger nut is our one true Lord Biscuit. A perfect dunker, sturdy enough not to disintegrate in your tea like lesser biscuits, a tangy ginger hit, satisfying crunch and just enough sugar to risk type 2 diabetes if you eat a pack a day for a good year on end.

    • Dworgi says:

      I’m addicted to Custard Creams, but only when dipped in tea. Dry, I find them bland, but the biscuit part absorbs the tea better and becomes wonderfully soft in the process. I used to believe that Ginger Snaps were the best tea-dipping biscuit, but was converted when work had run out of them and I had to push the boat out.

    • anonymousity says:

      Every biscuit ever created is really just the poor cousin of the melting moment.

  2. Dana says:

    Too bad this game requires W7 and Dx11 GPU.

    • jon_hill987 says:

      Read again. Vista has DX11 support and it only needs a DX10 GPU as the 8 series GPUs are supported.

    • sassy says:

      Not that it matters to me but still no XP support.

      Seems an odd choice not to make this DX9, cutting out a significant amount of the market for a game with such low graphical needs. It gained nothing by being DX10; in my opinion the only reason it needs it is odd programming choices by an unskilled developer (not saying bad programming… but it is a game, so it most likely is bad programming).

    • Kaira- says:

      @sassy

      Well, I guess the devs thought that the DX10-capable market is large enough to bring enough income when compared to supporting a 10-year-old OS.

    • Dana says:

      @jon_hill987
      Don’t blame me, the game dev said it requires DX11 (yes, I know he fixed it later).

    • Dana says:

      @Kaira-
      Well they lost one sale at least :D

    • Kaira- says:

      @Dana

      I’m not saying it’s the decision I might’ve made, but Win7 and Vista (32 & 64-bit) make up 72.55% of Steam’s users. So there definitely is a demographic for DX10-and-up games. And by leaving out XP, you cut out one chunk of development expenses (and, well, potential customers too).

    • lhzr says:

      the lack of dx9 support is hilarious. especially for an indie game.

    • TotalBiscuit says:

      I for one am glad that some developers are taking the bold step not to be hamstrung by stubborn gamers who refuse to upgrade their decade old OS.

      How much whining have we had recently that consoles are stifling the capabilities of PCs and yet some of those same people are still using Windows XP and being the demographic that’s holding back wider adoption of advanced DirectX capabilities?

    • johnpeat says:

      Problem is that XP ISN’T holding back games – there’s almost nothing in DX10/11 which makes games any better, it just makes em a bit prettier…

      In most cases, developers “lock out” people, not because of some amazing feature that their game really, really needs – just because either

      a – they/their publisher did a deal with MS to promote DX
      b – they used a development tool/engine/other thing which doesn’t support the older DX

      Analogy: it’s like being told your car can’t be repaired because the garage threw away its non-metric tools, or that you have to buy a new house because sofas are now only made 2″ wider than your door…

    • TillEulenspiegel says:

      Fun facts: OGRE and Unity, two of the most popular engines for indie 3D development (eg, Torchlight uses OGRE), are both still using Direct3D 9. And OpenGL, of course.

      There’s really no good reason to use D3D 10/11 unless you’re doing super-fancy, pushing-the-GPU-to-its-limits stuff. There’s also very little reason to write straight DirectX code at all. It’s not a particularly nice API, unless you really want to handle all the gritty details yourself.

      It also completely locks you into Windows, when you could also be releasing to MacOS, iOS, Android, etc. with relatively little effort if you’d made a different choice.

    • deanbmmv says:

      From the devs themselves:
      “The game is built on our high end engine, BitSquid Tech. BitSquid Tech supports DX11 and the next generation of hardware, so we have chosen to focus more on the future. I think over 80% of the Steam population have computers that can run DirectX 11 and that figure will increase a lot in the next 12 months.”
      We asked them about the lack of XP support; that was their response.

      I feel 20% is a fair bit to cut out. But if it means you aren’t having to continue with legacy techniques and can focus on a growing market, then go for it. XP’s share is only going to get smaller and the number of DX10-capable machines higher.

    • Dana says:

      @TB
      How hamstrung? This is an indie puzzler, not Crysis 2 (which works on XP) or yet-another-spectacle-FPS. After watching your preview, I decided to buy it for my Mom, so she could play it at work. But unfortunately we’ve only got computers with XP in the company. My point would be that, in my humble opinion, the genre, target audience and gfx quality don’t justify requiring W7, given the fact that more than 50% of market net share still belongs to XP.
      Does that make any sense? And I’m not “whining” here, I respect their decision; it’s just my opinion that it was wrong.
      (I’ve got Windows 7 myself.)

    • sassy says:

      The game is a niche title that uses dated graphics (not bad, but dated), so why would they cut out 1/5 of their target audience to gain graphics features they barely use? It hardly saves anything in development costs.

      Unless they are planning on using the engine on other games then they should have been focused on the largest, most profitable market, on PC that is DX9 as DX10/11 are both compatible with it and you have a greater chance running it on Linux and Mac. Really for a game like this they should have thought cross platform and made it easily portable to XBLA or iOS.

    • lhzr says:

      if people upgrade their OS to vista or w7, they do it for dx11 features. if they do that it means they got high end cpus and gpus that can handle those features. they pay that sort of money because they want to look at fancy graphics. these are not the people that play indie games, they play crysis.

      i’d have thought this much would be obvious.

    • deanbmmv says:

      @lhzr:
      Not really; if you’ve bought a PC within the past few years it’s generally going to be Vista or Win7. XP isn’t even sold anymore. People don’t have Vista and up because they want to run Crysis, they have Vista and up because it’s all you can now buy.

    • Mctittles says:

      It amazes me how well Microsoft’s plan to force people to buy their newest OS worked. Instead of getting rightfully upset at the move not to allow DX10+ to be installed on an older OS, there are people praising it. It’s a pure marketing move that cares zero about the consumer and little about the end product, but their marketing wins out and is regurgitated by those who know nothing about the underlying “tech” and just agree with the crap they’ve been shovelled.

      It’s not only consumers and kids commenting on this either. People in the professional field are sold on this crap as well. I suppose if doctors can recommend a certain brand of medicine based on marketing and/or kickbacks it works the same way with journalists and game companies.

      The truth of it, though, is that DX10+ is NOT necessary at all. Nor is owning Vista/7 to run said DX. This is proven by the people who managed to hack DX10+ to work on XP. There is nothing in Vista/7 that DX10+ requires to run (except for what Microsoft put in to prevent XP installation). Vista is still slower than XP, and like they have done over and over they just released a half-thought-out graphics update with loads of problems. Seems like all is forgotten, though, with some quick bug fixes and a relabelling to “Windows 7″.

      I can understand kids and the like getting sold on DX through Microsoft’s marketing, but developers and journalists? Come on. Do programmers actually learn anything for themselves anymore, or just listen to whatever the marketing tells them?

      Case in point: http://unigine.com/download/
      This “demo for DX11″ gets passed around to show off what you get with the API, although all it really does is prove how unnecessary it is. Here is why:

      1. It runs on OpenGL.
      OK, case closed. You can run this in OpenGL mode on Windows 98 or Linux if you want to. Looks the same. Runs the same? Well, no, it runs a little slower for me at least, but I’d say that has more to do with poorly written shader programs than OpenGL itself. I mean, the title of the .exe is “Heaven DX11 Benchmark 2.5″, so even if no money changed hands for this they popped a chub for DX11 somewhere, and I’m inclined to say there is bias involved. But hey, that’s not the only reason to call this rubbish…

      2. Tessellation?
      THIS is what everyone is talking about. Good old tessellation. But what is tessellation? Anyone? OK, I’ll give. Tessellation refers to subdividing polygons into more polygons to create a higher-polygon model. Take a four-point rectangle, draw an X in the middle and you get 4 triangles instead of one polygon! Neat, huh? Well, not really. The question is why? Why draw a low-poly model and let the cpu spend more work splitting it up into a higher-poly one when you can just start with a high-poly model to begin with? I’ve heard one argument for this: “because it uses less memory”. Where does it use less memory? Not the graphics card. By the time it’s in the card it already has more points added to it. So where? The hard drive? I suppose, but unless you are worried about mere kilobytes of hard drive space it’s a pointless optimisation.
      Then you have to think about texturing. The only way you are going to be able to make textures look ok on the high poly model is to design it on a high poly model to begin with. So you make a high poly model; texture it; reduce it to a low poly model; feed it into the game; let the cpu turn it into a slightly different high poly model; and then apply your texture. What? Why?
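      To make the tessellation idea concrete, here is a toy Python sketch of midpoint subdivision. This is plain CPU code and has nothing to do with DX11 or any real engine; it just illustrates the “split one polygon into four” idea the comment describes, using quads rather than triangles for simplicity:

```python
# Midpoint subdivision of a quad: the core idea behind tessellation,
# sketched in plain Python. Each pass splits one quad into four by
# inserting edge midpoints and a centre point.

def midpoint(a, b):
    """Point halfway between two 2D points."""
    return ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)

def subdivide_quad(quad):
    """Split one quad (4 corner points, in order) into 4 smaller quads."""
    p0, p1, p2, p3 = quad
    m01 = midpoint(p0, p1)
    m12 = midpoint(p1, p2)
    m23 = midpoint(p2, p3)
    m30 = midpoint(p3, p0)
    centre = midpoint(m01, m23)
    return [
        (p0, m01, centre, m30),
        (m01, p1, m12, centre),
        (centre, m12, p2, m23),
        (m30, centre, m23, p3),
    ]

def tessellate(quads, levels):
    """Apply `levels` passes of subdivision; polygon count grows 4x per pass."""
    for _ in range(levels):
        quads = [q for quad in quads for q in subdivide_quad(quad)]
    return quads

unit = [((0, 0), (1, 0), (1, 1), (0, 1))]
print(len(tessellate(unit, 1)))  # 4
print(len(tessellate(unit, 3)))  # 64
```

      The polygon count multiplies by four per pass, which is exactly why the memory argument centres on where the extra vertices get created, not whether they exist.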

      3. Stairs etc.
      So according to this DX11 demo, without DX11 you cannot have stairs in games. If you toggle it off they are flat planes. Amazingly people are eating this garbage without question. Of course you can have stairs (and rooftops, and bricks) without DX11. So why do they make it appear you don’t get the effect without it?
      How are the stairs made? Well, it’s a two-step process that starts with tessellation. First the polygon is split into more polygons on the cpu. Next they apply a displacement map (http://en.wikipedia.org/wiki/Displacement_mapping) to it to create the 3D model that the graphics card works with in the end. Instead of just making stairs and textures for the stairs, the program goes through all these extra steps and uses more resources to accomplish what games have been doing for years. Not only does it make the game run slower, but it looks worse. Textures don’t always line up, and are blurry and mushy on the edges.
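      The displacement-map step being described can likewise be sketched in a few lines of Python: a height function (the “map”) pushes the vertices of a flat, already-subdivided strip upward, and a step-shaped height function produces stairs. This is only an illustration of the concept under simplified assumptions, not how the Heaven demo or any real engine implements it:

```python
# Displacement mapping in miniature: a height value sampled from a "map"
# lifts each vertex of a flat, tessellated strip. A step-function height
# map turns a flat plane into stairs, API-independently.

def stair_height(x, step_width=1.0, step_height=0.5):
    """Displacement 'map' for stairs: height jumps every step_width units."""
    return int(x // step_width) * step_height

def displace(vertices, height_fn):
    """Lift each flat (x, y) vertex to (x, y, z) using the displacement map."""
    return [(x, y, height_fn(x)) for (x, y) in vertices]

# A flat strip of vertices, one per unit along x (as if already tessellated).
flat = [(x, 0.0) for x in range(5)]
print(displace(flat, stair_height))
# [(0, 0.0, 0.0), (1, 0.0, 0.5), (2, 0.0, 1.0), (3, 0.0, 1.5), (4, 0.0, 2.0)]
```

      The same stair geometry could of course be modelled directly, which is the comment’s point: displacement only pays off when the flat mesh plus a small map is cheaper to store and transfer than the full model.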

      In all honesty the API is of a lot less significance today than it was so many years ago. Today everything can be done with shaders. A “shader” refers not to shadows or something like it sounds, but is just a small program you write for the graphics card. So if I wanted to do tessellation I could just write a tiny shader program to split up polys for me. There are quite a few out there already written so I could just copy those too.
      The only reason I can think of for using DX11 is that I’ve heard it’s more efficient than DX10, in the way Windows 7 is more efficient than Vista because Vista was so damn slow.

      This is what you get when someone spends millions on marketing something before actually having a useful product. A turd in a bag that sells like hot cakes.

    • psyk says:

      Do you get angry at everything you can’t use?

    • lhzr says:

      do you need to defend every purchase you’ve made?

      some people get a console only to start telling everybody over and over how the games run and play so much better on their system than on pcs. they’re the same people that keep fueling the xbox vs ps3 disputes. they are god damn annoying.

      ok, so you’ve bought w7 or an xbox or an iphone. good for you. no one cares how you spend your money. stop being so defensive about your purchases. it’s lame.

    • Nic Clapper says:

      A lot of people seem to think that directX is what is pushing tech forward. Some things on that..

      For one, DX isn’t even needed at all. It’s pretty much a bunch of preset functions that devs can use — and not only can you make your own w/o DX, you could possibly make them better and more efficient. For twosies — if anything, DX holds back progression. Instead of devs coming up with their own innovative ways of doing things, they just use whatever DX has to offer, in the process not only making a lot of games have similar dressing but also making the end user have to have systems compatible with DX#.

      Hey, maybe that can be easier for some devs — that’s fine, but that’s NOT what I’m talking about. It’s the misconception that the higher the DX#, the more advanced something is… it just doesn’t make sense once you actually know what DX is. What’s ‘new’ and ‘better’ should be based more on hardware, if anything, not the version of DX being used.

    • stahlwerk says:

      While I haven’t yet directly programmed in DirectX 9, 10 or 11, from what I have heard a lot of cruft got deprecated in the move to 11, leaving it lean and easy to program against. Having dealt with Apple’s laughable OpenGL stack, I’m all for platform providers pushing towards modern graphics systems. It’s not always about looks and capabilities; to the common engine programmer, stability of the API and modern features are a lot more important.

      Also, don’t be hatin’ on the tesselatin’. Being able to increase polycount selectively (i.e. on the silhouette of objects) in hardware is much, much, much preferred over using a very-high-res polygonal mesh from the beginning, when all you care about are correct seams between displacement maps. Why waste precious bandwidth with vertex transfers when you can just tell the GPU to do its thing with the data that’s already there?

    • Raiyan 1.0 says:

      It’s amazing how people think there is no reason to upgrade from XP. It’s not just for the DX11 compatibility, you know. MS had absolutely no consideration for internet security when writing the architecture of XP, and 7 is light years ahead in that respect. I had a multitude of antivirus and anti-spyware software running when I had XP, whereas for the past year 7’s Windows Defender has been handling that load. I’ve yet to be infected once, something I can’t say about XP.
      Yeah, XP runs great so long as you create a separate non-admin user and use only that account for just about everything; otherwise your security model is not all that different from Windows 98’s.
      Windows editions based on kernel 6 (Vista, 7) run everything as a standard user unless you explicitly tell them otherwise, have much better display stacks/drivers with WPF and WDDM, and native support for nifty SATA features like NCQ. On that last point, Windows 5 (2000, XP, Server 2003) basically does IDE emulation over SATA by default.
      Also, you may have fond memories of messing around with VESA drivers or buying “special editions” of games that would only run on specific video hardware, but nobody wants to give up their unified APIs and go back to that era.

    • Wulf says:

      @lhzr

      Or, you know, they could be defending a good OS on the simple merits of, hey, it actually being a good OS. But don’t let reasoning get in the way of defending yourself from progress. :p

      Really though, it’s so much better than XP that it’s like comparing Windows XP to Windows ME. It’s just that much better. The taskbar is incredibly helpful for organising windows (in ways that XP users couldn’t dream of), being able to switch to the desktop (introduced in Vista) fixes an oversight in XP that was never addressed, the newly optimised Aero is really nice visually, and, my favourite feature, it has a full-screen magnifier.

      It’s nice to know that Microsoft has my back in that regard. Otherwise I’d have to have given up parts of PC gaming, because we all know how many PC games just love being completely inaccessible, not caring about accessibility, and screwing over anyone with a disability (be it colour blindness, hearing difficulties, sight difficulties that can’t be solved with glasses, or what have you).

      Really, PC games are horribly inaccessible a lot of the time, and if it wasn’t for Win 7 and its full screen magnifier then I’d be screwed. A part of the problem I admit is that my LCD monitor has a really high native resolution, but if developers thought of things like font sizes (or having the font sizes obey the font settings of the host OS), and making things clear and visible without visual obfuscation, then it wouldn’t be an issue.

      Even Terraria, due to my native resolution, has ridiculously tiny text. Windows magnifier really saves my arse there when crafting and sorting.

      See? There are perfectly good reasons to defend it not as a purchase, but as a far, far, far superior OS to XP by its own merits. XP is outdated, ancient, and it’s becoming worthless. Eventually you’ll all have to upgrade… and quite frankly? 7 will make you happier than XP.

    • daf says:

      @Mctittles and others
      It’s been 3 years since the last update to Windows XP (it was SP3, btw), an OS that’s now a decade old. I can understand people’s attachment to it; for someone coming from Win9x it was a massive improvement. As for me, I’d been using Windows 2000 Pro, so XP was little more than a bloated update I only bothered changing to when apps started to cut Windows 2000 support. XP is 10 years old and it’s now its turn to get support cut. And let’s be reasonable: do you really have anything running on your computer that is that old (emulators and virtual machines excluded)?

      Developers are lazy by nature. Everything that can be done in DX11 can be done on any OS (even without one) as long as the programmer wrote the code for it, but that’s the problem. Do you think a game developer writing his shiny new engine wants to spend his time re-implementing things that DX11 gives him for free, just to support a dead OS (no longer sold)? Not to mention sticking to DX11 gives a nice clean base of assumptions about hardware capabilities to work from, afaik.

      Forced or not, it’s time to move on. Windows 7 is Vista Second Edition with all the issues worked out; it has all the nice things XP had and adds a few, like not giving you a blue screen when your graphics driver crashes, better support for modern hardware, somewhat improved security, etc. Nobody expects developers to support Windows ME (released in 2000), so it shouldn’t come as a surprise that XP (released 2001) is now getting the same treatment. If anything it came unusually late, probably due to the development and PR disaster that was Vista…

    • JWendin says:

      XP is at its end-of-life. There’s no reason whatsoever for anyone building an entirely new engine to support something Microsoft themselves won’t support in the very near future. Cutting out all the legacy fixed-function cruft is very liberating for a developer.

      This reminds me of the people holding on to Windows 98 back in the day. Embrace change, it’s for the good of everyone. :)

  3. Scatterbrainpaul says:

    Oreos make better milkshakes; a custard cream milkshake would be pretty awful.

    • AndrewC says:

      What will finally bring America down is turning everything into milkshakes.

    • Berzee says:

      I tend to dip any baked good that isn’t A) Bread or B) American Biscuits in milk — either briefly or until they disintegrate depending on my mood. Not exactly a milkshake…but certainly something I am persecuted for even in This Great Nation.

    • RedWurm says:

      I actually find oreo milkshakes rather unpleasant. Possibly better than a custard cream milkshake, but certainly not something I’d go for. And I’m quite the enthusiast when it comes to things being milkshook.

  4. johnpeat says:

    As someone offering opinion, using expressions like “I don’t know why” and “No idea about that” every 30 seconds or so isn’t confidence-inspiring — biscuits need research and so do you!

    My mum also said arguing with yourself is a sign of madness – although your support of the custard cream at least partly ameliorates that.

    If you’re going to make a 30-min video tho, you need to practise the Rob Brydon more — that said, it’s quite funny the way it fades away as frustration and tiredness clearly set in.

    Semi-seriously — where it really goes wrong is where you try to decide what people will and won’t like/be annoyed at. Part of the role of the reviewer is to tell people what a game does and let THEM decide that, isn’t it? I found the need to redo the puzzles in HGA a bit wearing after a while — I’d have liked some checkpointing earlier on, or a time-rewind feature perhaps.

    • sassy says:

      You realize this is a first impression, don’t you? TotalBiscuit hadn’t played the game until filming and therefore has had no time to think about the game mechanics, nor even what he will say. All you are getting is commentary and quick thoughts as he has them. You cannot call this a review, and it is unfair to expect the same things as from a review; these are undeveloped thoughts as they are happening.
      As for telling you what you will find annoying, I don’t think you were listening very closely. He did state some things that were annoying and some things that he believes you will find annoying, but he never told you that you will find them annoying… except with the camera, I believe he did (and admittedly I was getting annoyed at the camera the whole time; its zooming around is stupid. Give me a fixed position with 45-degree rotation either way!)

    • Wulf says:

      It’s the first portion of a Let’s Play is what it is. Which, quite frankly, I find rather handy.

  5. Pike says:

    I spent the first twenty years of my life thinking that Custard Creams were some sort of generic vanilla version of Bourbons (uugh) and thus avoided them. I was a fool.

  6. hookjamweasel says:

    Biscuits? What is that, something you all eat with your lamprey pie?

    It’s a friggin cookie for crying out loud.

    Biscuits are something you garnish gravy with.

    At any rate, Hamilton’s Great Adventure is pretty good…

    • Berzee says:

      Racist.
      Read some of the older Spotlights on Biscuits and you can benefit from my struggle to understand these strange and beautiful people.

    • AndrewC says:

      OK, we need an FAQ on biscuits, cookies, biscuits (American) and Americans (why they are wrong). This will sort out a lot of problems.

    • Tomm says:

      What is this madness, cookies are a type of biscuit, I will hear no more!

    • sassy says:

      We all know America is just wrong on the name. The word biscuit was around before America was colonized and therefore it is just another of the things they did to piss other English speaking nations off (like dropping the u in a multitude of words like colour)

    • Berzee says:

      You would also have to include cookies (American) in that list I think, AndrewC.

      I appreciate the last Spotlight on Biscuit, which included something that actually looked homemade (I think it was simply Oatmeal Raisin Cookies). That was nice, though controversial.

    • Edgar the Peaceful says:

      Let’s discuss Crackers and Fags and see where we get.

    • Consumatopia says:

      Dang, if it were just “Americans should say ‘biscuit’ when they say ‘cookie’”, I could get down with that, but the words actually mean different things and you need some kind of mapping between two Venn diagrams to understand them. And the American one makes more sense from the perspective of eaters (Why use one word for both cookies and crackers? I would never think to eat the one when I felt the need to eat the other–i.e. I would never say “I am hungry for a cookie or cracker right now” while apparently that is what I would mean if I said “I am hungry for a biscuit”).

    • Ergates_Antius says:

      If you said “I’m hungry for a biscuit right now” it’d be clear you were (probably) hankering for a sweet biscuit [cookie].

      If you said “I’m hungry for some biscuits and cheese” it’d be clear you wanted savoury biscuits [crackers].

      In the same way that if you were standing there holding a violin and said “can you fetch me my bow please” it’s unlikely the person would return with a piece of archery equipment.

      The meaning is determined from context, there is no real confusion.

    • Radiant says:

      An /American/ biscuit is pretty much a tasteless scone.
      In all senses of the word.

      Brit scones will make you want to rub on yourselves naked btw.

    • stahlwerk says:

      O, the old duality of crunchily baked dough. I’m guessing the term biscuit shares a root with the German Zwieback, which traditionally was a twice-baked, but not necessarily sweet, bread, the second baking keeping it from taking on humidity, which made it popular with the seafaring kind. Since Americans didn’t care much about ships until 1880 or so, they got hooked on the decadent inlandish sweetened version, cookies (German: Kekse), first. And then they put more sugar in it. And then even more. Then they coated them with chocolate.

      That’s my theory, at least.

    • Eddy9000 says:

      Stahlwerk is correct, although the nearer etymology to the English is the French: ‘bis’ meaning ‘twice’, as in ‘BIcycle’, ‘BIsexual’ and ‘BIfocals’, and ‘cuit’ meaning ‘cooked’.
      Therefore a BISCUIT is produced by a specific cooking method and is different from a cookie.

      Even in the colonies biscuit has a different meaning to cookie, meaning a small hard cake:
      http://en.wikipedia.org/wiki/Biscuit
      But they are a strange, backwards and psychotically cheerful people, and their ignorance of baked goods and their naming should be treated with kindness and gentle pity.

    • Mr_Initials says:

      COLLOQUIALISM. Neither side will convince the other that they are right because to that specific person, they are correct.

  7. Bhazor says:

    Finally someone lays the smackdown on Oreos. Never liked those fuckers and their current inexplicable popularity across Britain should worry any true biscuit fans.

    • bill says:

      Seriously? I’m glad I left before I had to see that.

      Oreos are good crunched up in ice cream; that’s about it.

      Next thing you know you’ll be telling me that Hersheys has become popular in the UK.. not that i’d ever believe that as it’s so foul.

    • TotalBiscuit says:

      Hershey’s is actually sold in the UK now. I have never seen anyone buy it. With good reason, it is inferior even to the value store-bought chocolate, let alone anything else.

    • westyfield says:

      Oreos can piss off back whence they came.
      They’re a worse version of the Bourbon, and Bourbons aren’t that great.

    • felisc says:

      hershey ? as in hershey’s cups, those unbelievably tasty tiny chocolate filled with peanut butter ? mmmh.

    • Ergates_Antius says:

      Oreos taste of tar; I’d rather lick my cat’s arse than eat one. Any appearance of popularity in this country is merely down to the large advertising push they’re getting at the moment.

    • Berzee says:

      Felisc, if you really enjoyed those, you’d know they are called REESE’S CUPS.

    • Baf says:

      I’m an American and I AGREE.

      Oreos are held in esteem by people who remember liking them when they were small children. However, the reason they liked them then was that they were small children, and willing to like pretty much anything with enough sugar in it.

  8. bill says:

    The happiest day of my life was when they started selling custard creams in the Tescos in Tokyo.

    But I do suffer from the inability to stop eating them once opened… so the whole pack (slightly larger than most Tokyo apartments) goes in about 5 minutes.
    It’ll make up for all that healthy sushi.

  9. Soon says:

    The levels and aesthetics remind me of the labyrinthine design upon the face of the custard cream. Appropriate choice.

  10. Consumatopia says:

    I say the Oreo would be better without the filling, just a chocolate cookie intended for milk-dipping.

    Both our countries kind of suck at food, though.

  11. LuNatic says:

    Strewth! Poms and Yanks, Yanks and Poms, you’re all bloody wrong. Have a crack at a Tim-Tam mates, fair dinkum. Hoo-roo!

  12. Radiant says:

    MOTHERFUCKING CUSTARD CREAMS.

    If you don’t drink tea you just don’t understand, you just can’t understand.
    It’s like not being able to feel your legs.
    Not being able to feel your face they’re that good.

    Good shout with chocolate hobnobs.

  13. psyk says:

    Reply fail

  14. Radiant says:

    In the early years of civilisation, up until the recent biscuit civil war; Custard Creams traded in Bourbons as slaves.
    Fact.

    Also when Kraft created the interracial Oreo cookie it ruined his porn career.

  15. Mitthrawn says:

    You brits are so strange. Here in AMERICA, we have cookies, crackers and biscuits. Apparently in britain you have one word that means all three. Waaay too confusing. Just use the proper american terminology. Far easier. And don’t even get me started on colouuur and armouuur. Putting extra letters in words, guys? What are you, the french?

    • Berzee says:

      The “u” debate is totally unrelated here unless you are going to start writing “biscit”. I slash half of your comment from my memory.

    • AndrewC says:

      Not bad, but these tactics won’t work to troll Brits because we think you sound silly, both unnecessarily simplifying and over-complicating your language at the same time. Plus we understand the deep sadness at the heart of your nation, for when you turned your back on tea, you turned your back on the dunking of biscuits in tea, thus denying you the glory of biscuits, and tea, and little bits of biscuit at the bottom of your tea that you don’t bother to wash out immediately and so dry there forever, right in the angle between floor and wall of the cup so making it impossible to ever remove so we had The Isle Of Man built entirely out of crushed tea cups that had become unusable through build up of old biscuits. This is our heritage. Yours is an embarrassing simulacrum involving donuts (sic) and coffee that’s like a child trying to mimic its parent without any understanding of what their parent is doing. And you smell.

    • fuggles says:

      Well our language comes largely from French, so yeah. What’s with spelling laser with a Z when it’s an acronym?

    • Eddy9000 says:

    He’s got a point though: the way the newly formed English colonies have bastardised the ancient American language is unforgivable.

      Wait, what?

    • Raiyan 1.0 says:

      Metric > Imperial

      Only vaguely tangential, but had to be said.

    • Oak says:

      Metric units have no personality. I oppose the system on aesthetic grounds.

    • Raiyan 1.0 says:

      @Oak: Well, yeah, the Imperial System does have a personality.

      A sadistic one at that, too.

  16. Lars Westergren says:

    This biscuit vs cookie vs cracker debate is way too Anglo-Saxon-centric. I demand you acknowledge the rich tradition of Scandinavian sugary buttery breads. Like, for instance…

    Actually, let me get back to you on that. Don’t go anywhere!

    • Lars Westergren says:

      Ooh, ooh! I remembered one! Ginger snap cookies. They are awesome.

    • Bhazor says:

      Never tried Scandinavian biscuits but Danish biccies, now they be a fine treat. Also the German almond horse shoe has a special place in my heart as they were the type my gran used to make.

      Reason enough I feel to join our brothers in the E.U.

  17. Bhazor says:

    What sound effect did they use for Sasha? It sounds so familiar…

  18. Holybasil says:

    Are you ever going to call him TotalBiscuit instead of Total Biscuit? I mean, he even comments on RPS regularly. You’d think people would know how it’s spelled by now.

  19. Lazaruso says:

    I want to hear more about Hamilton’s sweaty action-filled hour too!

  20. adonf says:

    A 48-thousand-hertz sample rate for sound playback is a bit overkill
