Digital Foundry Reveals Nintendo Switch CPU And GPU Clock Speeds

While the official specifications for the Nintendo Switch have yet to be revealed, Digital Foundry and Eurogamer say they have heard from trusted sources exactly what’s inside Nintendo’s next-generation system. The site reports that we shouldn’t expect to see Switch versions of cutting-edge blockbusters, but that the hardware is capable of handling some ports.

“Where Switch remains consistent is in CPU power – the cores run at 1020MHz regardless of whether the machine is docked or undocked. This ensures that running game logic won’t be compromised while gaming on the go: the game simulation itself will remain entirely consistent. The machine’s embedded memory controller runs at 1600MHz while docked (on par with a standard Tegra X1), but the default power mode undocked sees this drop to 1331MHz. However, developers can opt to retain full memory bandwidth in their titles should they choose to do so.”

“As things stand, CPU clocks are halved compared to the standard Tegra X1, but it’s the GPU aspect of the equation that will prove more controversial. Even while docked, Switch doesn’t run at Tegra X1’s full potential. Clock-speeds are locked here at 768MHz, considerably lower than the 1GHz found in Shield Android TV, but the big surprise from our perspective was the extent to which Nintendo has down-clocked the GPU to hit its thermal and battery life targets. That’s not a typo: it really is 307.2MHz – meaning that in portable mode, Switch runs at exactly 40 per cent of the clock-speed of the fully docked device. And yes, the table below does indeed confirm that developers can choose to hobble Switch performance when plugged in to match the handheld profile should they so choose.”
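For context, the 40 per cent figure is straight arithmetic on the clocks quoted above; here’s a minimal sanity check (nothing assumed beyond Digital Foundry’s reported numbers):

```python
# Sanity-check the clock figures Digital Foundry reports.
docked_gpu_mhz = 768.0    # GPU clock while docked (reported)
portable_gpu_mhz = 307.2  # GPU clock undocked (reported)

print(f"Portable GPU = {portable_gpu_mhz / docked_gpu_mhz:.0%} of docked")  # 40%
print(f"Undocked memory clock = {1331 / 1600:.1%} of docked")               # 83.2%
```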

Source

102 thoughts on “Digital Foundry Reveals Nintendo Switch CPU And GPU Clock Speeds”

  1. As long as these sources remain anonymous, I’m taking all Nintendo Switch news not revealed by Nintendo itself with a grain of salt, whether it’s positive or negative.

    1. Why would providing a name make it any more likely to be true? It’s not like you would know a Jean Peltier at Ubisoft or a Sayoru Amano at Atlus well enough to deem whether or not they’re trustworthy, and it’s likely the same sources that told them what the Switch was to begin with.

  2. Yawn… more rumors? We’re not done with this crap yet?

    Oh, I heard from a friend who works with a guy whose uncle worked at Nintendo in the 80s that all we’ve seen of the Switch is just a distraction until they reveal the true console. You heard it here first. My source is very credible because he exists only in my imagination…

      1. I like how people cry “rumors” whenever the rumors sound bad for Nintendo, but we all forget that the rumors end up being true and still make Nintendo look bad. This is a typical Nintendo move, and it sounds as close to the truth as any dumb move Nintendo usually makes.

        1. It is a rumor. The information is based on numbers. They are trying to determine that it is X1-based when it could still be Pascal-based, using the numbers from the development specifications. It is a custom-built chip. It is the same as with the Wii U GPU: they gave up trying to figure out what the truth was, because there was too much missing information for anyone to determine it. All they have is the numbers, but they don’t know what the numbers are for. If they were developing a game, they would understand what this means for them instead of trying to create information from nothing.

          1. What are you talking about? They’re not basing things off numbers, they’re reporting numbers from their sources, and regardless of whether or not it is custom-built, the shader cores are based on Maxwell cores. By “custom” it could just mean that the little cores were removed to simplify the chip, and it might have advancements in things like delta color compression, but don’t for a second act like the chip was built from the ground up for Nintendo.

            And how do they not know what the numbers are for when they told you what they’re for? When mobile, the GPU does 157.3 FP32 / 314.5 FP16 GFLOPS. When docked, it’s 393.2 / 786.4 GFLOPS (the math is sketched below). Of course, those numbers don’t tell you performance, but we do have benchmarks for the chip actively cooled at 1GHz and passively cooled at 850MHz, so we can loosely ballpark actual performance with that.

            I don’t get how people think computers are magical and humans do not yet know how they work.
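
            For anyone who wants to check those figures, the back-of-the-envelope math looks like this (a sketch assuming the stock X1 layout: 256 CUDA cores, 2 FLOPs per core per clock via fused multiply-add, FP16 packed at double rate):

            ```python
            # Theoretical throughput = cores x FLOPs-per-clock x clock speed.
            # Assumed: stock Tegra X1 layout (256 CUDA cores, FMA = 2 FLOPs
            # per core per clock, double-rate FP16).
            CORES = 256
            FLOPS_PER_CLOCK = 2

            def gflops(clock_mhz, fp16=False):
                flops = CORES * FLOPS_PER_CLOCK * clock_mhz * 1e6
                return flops * (2 if fp16 else 1) / 1e9

            for mode, mhz in (("portable", 307.2), ("docked", 768.0)):
                print(f"{mode}: {gflops(mhz):.1f} FP32 / {gflops(mhz, True):.1f} FP16 GFLOPS")
            # portable: 157.3 FP32 / 314.6 FP16 GFLOPS
            # docked: 393.2 FP32 / 786.4 FP16 GFLOPS
            ```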

            1. The source is the developer specification table. If you read the whole source article, you will see the only information given is the numbers in the table. That is what developers are given. They don’t know the CPU and GPU in the final specification. Digital Foundry is speculating based on those numbers. They have no idea what it is based on. The benchmarks are for stock products; this is custom, and we don’t know what it is based on. It could be a Maxwell/Pascal mix, or either one mixed with something else Nvidia has. They don’t know. No one knows to this day what the Wii U’s GPU is based on. So to answer your question, this is what I’m talking about.

              1. I did read the whole article, and that table has information they apparently got from developers. Whether it’s Maxwell or Pascal is largely irrelevant, as the largest difference is Pascal’s ability to clock higher since it’s manufactured with a 16nm FinFET process. They point out that they don’t actually know the ALU count, and going wider rather than faster would be more power-efficient, but Digital Foundry is right when they point out that the Maxwell- and Pascal-based Tegras are both 256 cores, so that’s likely a layout that Nvidia thought worked best for cost, performance, and power usage.

                Yes, those benchmarks aren’t running on the actual Switch hardware, but they’re good enough to ballpark things. You can observe the size, cooling, and battery size of a device and the specs that the X1 is running at, then compare its max results, its median results, and its long-term performance numbers, and get a sense of the architecture’s capabilities and what some of its weaknesses are. We do know that it’s definitely not Pascal mixed with something else they use, because both Maxwell and Pascal were designed to be mobile-first and the Tegra line is the pinnacle of Nvidia’s mobile chips. There’s nothing else they can borrow from that will make it better suited for low power consumption, and their desktop GPUs are just scaled-up versions of what’s used in Tegras.

                As for the Wii U, we know it’s pre-GCN, we know what graphics cards the dev kits were using, and we’ve found die shots, from Brazos I believe, that have ALU clusters laid out in the same way. We can safely assume it’s either VLIW-5 or VLIW-4 with either 160 or 320 ALUs, with 160 being more likely.

                1. The table is from the final specification of the Nintendo Switch for developers. So yes, that’s where their speculation is from. My original comment is correct. They don’t know.

                  1. “The table is from the final specification of the Nintendo Switch for developers.”

                    “My original comment is correct. They don’t know.”

                    What?

                    “Ball park and assume means nothing.”

                    ball park (adj) – (of prices or costs) approximate; rough

                    assume (verb) – suppose to be the case, without proof (though I was using it to mean “educated guess”)

                    You know what, let’s just live in your world where there are no products with the Tegra X1 or Maxwell- and Pascal-based chips in them, and nobody can figure out anything about anything without Nintendo telling them.

                    1. I will continue to live in a world of facts instead of a world of speculation. I’m glad I don’t assume whenever I’m programming. I probably wouldn’t get those contracts.

                  2. Well bravo, Tre. I’m not dealing with anything less factual than you are. People can still speculate based on good info.

                    Also, I’d love to hear what kind of programmer you are, because a ton of programmers don’t know about hardware. Are you a C++ programmer whose code is abstracted away from assembly? Are you a Java programmer whose code is compiled to bytecode? Are you a web programmer like so many others, whose code runs on a JS JIT in a browser on top of an operating system and makes requests with a network stack through a driver that communicates with firmware to make a request from another computer entirely? Or do you deal with graphics code, which almost exclusively dealt with, until recently, high-level APIs made to run on GPUs from IMRs to TBRs to TBDRs?

                    Honestly, you brought up the programmer thing as if programming had anything to do with hand-soldering logic gates like it did in the 40s and 50s.

                    1. I have been in programming for 25 years. I mentioned programming as a reference that speculation doesn’t help. You need to know all aspects of the hardware, not just clock speed. Once you know the hardware, you need to know how it reacts to different software. You don’t look at just one or two numbers to determine the capabilities. Even then, the capabilities mean little if not used to their fullest. I’m not attacking you. I’m saying speculation is worthless.

    1. That doesn’t mean it’s clickbait. Nintendo doesn’t have to be the source of information for it to be true. Other people know about the Switch besides Nintendo.

  3. There’s more info in the article itself, like this:

    “But from a different perspective, this makes what we have seen even more impressive. Nintendo’s hardware is all about an all-in-one console you can take anywhere while continuing to play the same games. We fully expect to see the kinds of fare displayed in the reveal trailer fully realised: Nintendo doing what it does best, basically. Even a 307.2MHz GPU based on Maxwell technology should be capable of out-performing Wii U – and certainly the Zelda: Breath of the Wild demo seen recently on the Jimmy Fallon show revealed a level of performance significantly smoother than that seen in last year’s E3 code running on Wii U hardware. We should also remember that Nvidia has produced a bespoke software layer that should allow developers to get much, much more from the processor compared to what we’ve seen Tegra achieve in the Android-powered Shield console.”

    I think it’s obvious that we shouldn’t have expected cutting-edge blockbusters like RDR2 or anything like that. I’m no tech guy, but I suggest that those who are read the full article.

  4. And for those who are so quick to dismiss “rumors on the internet,” there are plenty that I’ve seen that turned out to be very true. Sometimes there are real people with real sources. I’m especially surprised to see this reaction to something involving Eurogamer who perfectly nailed what the Switch would be months before confirmation.

    1. This is a rumor/speculation. They are saying this based on the developer specification. The specification doesn’t say anything about how the custom chip is built.

  5. Wii U GPU = 550MHz, Switch GPU (in handheld mode) = 307MHz, but “Breath of the Wild” still runs smoother on Switch than on Wii U! Can someone explain this?

    1. Clockspeed is only one of many factors, so you can only really compare otherwise identical processors like that.

      If the rumors are true about this Tegra processor being based on a newer architecture than the old one (which is highly likely), we have no way of comparing. So in this case, like in many other cases, these numbers are basically meaningless.

      1. I don’t know what footage you saw, but what I saw was a stable 30fps. It couldn’t have shown more, since the video itself was only 30fps…

          1. The problem with the first Switch trailer was the overall framerate lag in the recording. The video itself had a lower fps, which definitely made Zelda look crappier than it prolly is.

    2. Clockspeeds are bullshit…

      There are two different CPU architectures:
      The Wii U uses IBM PowerPC architecture and the Switch uses ARM architecture.
      With an ARM CPU, developers can squeeze out more performance than with an IBM PowerPC CPU.
      Basically, the clock speeds don’t really matter.
      Look at the old Intel Pentium 4. It ran at 3GHz (iirc), but the chip in the Wii U could run circles around it. New chips are better despite what the numbers may tell you.

      January 12th is fast approaching!

      1. The Wii U was more powerful than the PS3, which was slightly more powerful than the Xbox 360, so if the Switch is much more powerful than the Wii U, how the fuck will it have Xbox 360-level visuals? Please slap yourself with a knife, you ignorant ass troll.

      2. Lol, for the record, the only chips I know anything about are the ones I eat. So if this chip is only running at Xbox 360 level, then how can the Switch run Zelda: Breath of the Wild better than the Wii U? So I’m guessing this claim has to be false.

      3. In what world does the Tegra X1 not outperform the X360? Just because the only things you’ve seen it run are Android games designed to run on much less capable hardware, don’t think that that’s the peak capability of the X1. The X1 has more capable shader cores with support for 32-bit and 16-bit floating point, and it has ASTC and delta color compression support.

        Here’s a game running with graphics on par with or better than an X360, running on either a Mali-T880 or an Adreno 530:

        https://www.youtube.com/watch?v=73prhNQwLQs

        and here’s how those mobile GPUs compare to the X1

        https://gfxbench.com/compare.jsp?benchmark=gfx40&did1=26084812&os1=Android&api1=gl&hwtype1=GPU&hwname1=NVIDIA%28R%29+Tegra%28R%29+X1&did2=30541579&os2=Android&api2=gl&hwtype2=GPU&hwname2=ARM+Mali-T880+MP12+%28dodeca+core%29&D3=Samsung+Galaxy+S7+%28SM-G930x%29

        1. Nintendo First Order Commander Quadraxis

          ||True and it will also depend on what kind of potential damage is done in the future…||

          ||Hardware specifications are the least of my concerns but it obviously should be more powerful than the Wii U…||

          ||Fortunately I have my own conclusions on what kind of energy source the Switch will use besides its own core…||

    1. What damage control? Once the Switch got unveiled, people knew it was going to be less powerful than current-gen consoles, because it’s a dedicated portable hybrid system that can only be a console when it’s docked. It’s essentially two devices in one, so of course certain things have to be compromised. There’s no way you can make a PS4-level portable device unless you want it to be expensive and a battery hog. Plus, Nintendo still hasn’t revealed everything about the Switch yet, and you forget that Kimishima spoke briefly about hardware add-ons being mentioned at the upcoming presentation, which could indicate a possible increase in the system’s power. So yeah, you stupid troll, like Commander said: no damage has been done.

        1. Nintendo First Order Commander Quadraxis' Mom

          ||And now you have to hide your voice to sound “manly”, Kallum, you always find a way to disappoint me…||

      1. Nintendo First Order Commander Quadraxis' Mom

        ||No need to insult those trolls, just let time show them what fools they are…my logic is undeniable…||

        ||As a parent you should set an example for those people and act like a mature person, it’s only videogames…it’s not war…||

  6. If this is true, it would really be a bummer. I don’t know what to believe anymore. Eurogamer and Digital Foundry have proven to be pretty reliable when it comes to Switch rumors, and those numbers seem to be pretty accurate. On the other hand, Nvidia itself said that the Switch runs on the same architecture as their top graphics cards, presumably the GTX 10 series. I also can’t really believe that developers like Take-Two, From Software, Square Enix, and Bethesda, whose games are usually highly demanding, would praise an underpowered console, as they have since the announcement. Let’s wait and see…

    1. Since its inception, the Tegra line of processors has been advertised as bringing the power of Nvidia’s GeForce graphics cards to mobile, but that’s largely been marketing. They’re not lying that it’s the same architecture as their high-end cards, but architecture refers to how groups of processing cores are interconnected and in what ratio they’re paired with texture, geometry, and raster output units. It doesn’t mean that the chip will have the same number of processing cores or that they’ll be clocked in a similar way.

      For example, the Tegra X1 has two SMMs (256 CUDA cores) clocked at up to 1GHz with support for LPDDR4 memory at 1600MHz, giving it up to 25.6GB/s of bandwidth, while a GTX 980 has 16 SMMs (2048 CUDA cores, the same cores minus double-rate FP16 support) clocked at 1.126GHz with support for GDDR5 memory at 3.505GHz, giving it 224GB/s of bandwidth.

      So it’s the same architecture but scaled differently, as the sketch below shows.
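
      To put that scaling in code (just the figures from this comment, not official spec sheets):

      ```python
      # Same Maxwell architecture at very different scales
      # (figures as quoted in the comment above).
      tegra_x1 = {"smm": 2,  "cores": 256,  "clock_ghz": 1.000, "bw_gbs": 25.6}
      gtx_980  = {"smm": 16, "cores": 2048, "clock_ghz": 1.126, "bw_gbs": 224.0}

      for key in ("smm", "cores", "clock_ghz", "bw_gbs"):
          print(f"{key}: GTX 980 = {gtx_980[key] / tegra_x1[key]:.1f}x Tegra X1")
      # smm: 8.0x, cores: 8.0x, clock_ghz: 1.1x, bw_gbs: 8.8x
      ```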

  7. I would gladly pay an extra $100 for consistent and reliable hardware so games can be played at a greater quality. Whether this rumor is true or not, it doesn’t surprise me. Nintendo is known for slacking on hardware, and it’s getting a little old. The same Nintendo magic can be instilled in good hardware. The idea of playing Breath of the Wild at 30fps or less is terrifying and disappointing. Merp. Just my opinion.

  8. So it has a weaker CPU than Wii U, 1/3 of the bandwidth, and a weaker GPU?

    Those specs wouldn’t run Zelda: Breath of the Wild. A 300MHz-plus GPU clock and 15GB bandwidth, LOL, that is far, far, far weaker than a Wii U.

    Wii U has 70GB bandwidth in the eDRAM alone and around 15GB in the main RAM, and no one runs a GPU at 700MHz-plus whilst the CPU is only 1020MHz… why not just run them 1-to-1 at 1020MHz?

    COMPLETE CRAP.

    An off-the-shelf X1 at full clock speed has NEVER EVER COME CLOSE TO A WII U, NOT EVEN REMOTELY CLOSE. DOOM FROM 10 YEARS AGO AND A HALF-LIFE 2 WITH ORIGINAL XBOX GRAPHICS DON’T COUNT AS EVEN CLOSE TO WII U.

    So Digital Foundry’s COUGH “at 300MHz it can still be better than Wii U” is the biggest BS I’ve ever read. There isn’t a single game on Tegra X1, at 1GHz GPU and 2GHz CPU, that is even remotely close to Wii U graphics.

    1. Hey Bowler. I think I’ve argued with you before.

      “15GB bandwidth”

      Actually, it’s 25.6GB/s. It’s DDR memory, so it’s 1600MHz × 2 (double data rate) × 64-bit bus ÷ 8 bits per byte = 25.6GB/s.

      That’s exactly twice as much main memory bandwidth as the Wii U, or a little under twice as much when clocked at 1331MHz (21.3GB/s). It would effectively be more, because mobile GPUs support ASTC, so even Wii U-quality textures would use significantly less bandwidth than they did on the Wii U. The math is spelled out below.
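
      Spelled out (a minimal sketch of that same arithmetic):

      ```python
      # DDR bandwidth = clock x 2 transfers per clock x bus width, bits -> bytes.
      def ddr_bandwidth_gbs(clock_mhz, bus_bits):
          return clock_mhz * 1e6 * 2 * bus_bits / 8 / 1e9

      print(ddr_bandwidth_gbs(1600, 64))  # 25.6 GB/s -- Switch docked (reported)
      print(ddr_bandwidth_gbs(1331, 64))  # ~21.3 GB/s -- Switch undocked default
      print(ddr_bandwidth_gbs(800, 64))   # 12.8 GB/s -- Wii U main RAM (DDR3-1600)
      ```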

      “Wii U has 70GB bandwidth in the eDRAM alone”

      We’re not sure if this has any eDRAM as well, and if we were to use the Wii and GameCube for reference, the Wii U’s eDRAM may have only used a 512-bit bus, making it 35.2GB/s. New Nvidia GPUs do have pretty fat L2 caches, delta color compression, and ASTC support, so that might take a significant amount of strain off of main RAM. Oddly enough, that kind of works out: if the color compression reduced G-buffer bandwidth by half (might be possible) and ASTC reduced texture bandwidth by half (that’s realistic), then that’s 17.6 + 6.4 = 24GB/s.
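
      As a back-of-the-envelope check (the halving factors are the illustrative guesses from above, not measurements):

      ```python
      # Wii U bandwidth vs. what a compression-equipped GPU would need instead.
      edram_gbs = 550e6 * 512 / 8 / 1e9  # 35.2 GB/s, if the eDRAM bus is 512-bit
      main_ram_gbs = 12.8                # Wii U DDR3 main memory

      # Illustrative assumption: color compression halves G-buffer traffic
      # and ASTC halves texture traffic.
      equivalent_need = edram_gbs / 2 + main_ram_gbs / 2
      print(f"{equivalent_need:.1f} GB/s")  # 24.0 -- close to Switch's 25.6 GB/s
      ```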

      “no one runs a GPU at 700MHz-plus whilst the CPU is only 1020MHz… why not just run them 1-to-1 at 1020MHz”

      Yes, they do. The Wii U had CPU cores clocked at 1.24GHz while its GPU and eDRAM were at 550MHz. In fact, most SoCs have their GPU and CPU running at different clock speeds. The reason Nintendo wouldn’t run both at 1020MHz is power usage and heat. The Shield TV used about 20 watts, which is a lot of power to run on a small battery. The Shield TV also had active cooling, which kept its performance pretty consistent, as you would see if you compared top and median scores in GFXBench. When the X1 was used in a tablet, the GPU was clocked down to 850MHz for power reasons, and its passive cooling meant that it would throttle down over time. When you’re making a gaming system, throttling isn’t an option and neither is overheating, so it needs to be clocked down to a speed it can run at consistently without overheating.

      “An off-the-shelf X1 at full clock speed has NEVER EVER COME CLOSE TO A WII U, NOT EVEN REMOTELY CLOSE. DOOM FROM 10 YEARS AGO AND A HALF-LIFE 2 WITH ORIGINAL XBOX GRAPHICS DON’T COUNT AS EVEN CLOSE TO WII U.”

      Here is the Tegra X1 running the 3DMark Slingshot test at 2560×1440 with OpenGL ES 3.1. The use of a lower level API would increase performance even more. The Wii U would have struggled to run this at 720p. Also stop yelling.

  9. Ok!

    Who gives a flying fudge!

    Just bring on the games with amazing gameplay and decent graphics where I can enjoy the game… like I have been doing for 30+ years already, without these oh-so-amazing graphics expectations of today (which don’t exist btw, they’re only shown then downscaled on the consoles after the presentations *cough Uncharted cough*).

    When will people grow up and realise there is more to gaming than fudging graphics… if you want to watch a movie minus gameplay and any depth, then go to the movies or slap on a Blu-ray, and will the graphics munchers please leave the real gaming to the people that care about just that… the gaming aspects!!!!

    *still mumbling in my head* “I need X MHz to enjoy a game now or else it just ain’t worth it… I don’t play a game unless it’s 1080p or 4K compatible on a 4K resolution TV… idiots!”

  10. BULLSHIT!!! Go to Wikipedia and search for the Wii U CPU and GPU clock speeds. The wiki page cites a Eurogamer article that reports the information from an “anonymous hacker”. To this day we don’t even know the details of the Wii U processors! There’s no fucking way that anyone besides the engineers and company leaders knows the exact Switch specifications. Game companies only have dev kits, which don’t represent the final hardware!

  11. What if Nintendo “leaked” those rumors about the GPU and CPU clock speeds being small just to throw us off, and the true clock speeds are actually higher, just not high enough to compromise battery life?

  12. The denial is strong here. This is only the first stage, though.
    Then rage kicks in, even though some people here are already experiencing that emotion.
    It will be a looong road until acceptance. But don’t be ashamed to ask for help.

  13. I’m getting scared also. This is the second report whining about how underpowered the Switch is. Hopefully they reveal the “supplemental computing device” that gives us 8 teraflops. Otherwise the price point and the gimmick they are selling won’t work.

  14. “That’s not a typo: it really is 307.2MHz”

    No… not 307.2MHz… NOT 307.2MHZ!!! ANYTHING BUT 307.2MHZ!!!!!!!!!!

    … erm, what does that mean?

  15. I’m honestly worried about how Nintendo is marketing the Switch.
    I see it as a “Nintendo PSP”, in the sense that it’s a handheld you can plug into the TV, so in my opinion, having a portable Wii U is fantastic!
    But they are calling it a home console, people are comparing it to the other home consoles, and, understandably, people are getting disappointed by the specs…

  16. I love Nintendo, but if this is true, it will be Wii U all over again. Nintendo will have ignored everyone (the majority of people) who asked for a stronger system. They might actually become a third-party developer like so many were hoping. Oh well. Nothing more than a portable console à la 3DS that can connect to a TV.
