Nintendo President Says He’s Contemplating A Nintendo 3DS Successor

Nintendo president Tatsumi Kimishima says that he’s thinking about a successor to the popular Nintendo 3DS system. Kimishima believes there’s still a market for a dedicated handheld despite the ever-growing popularity of mobile devices. The Nintendo 3DS continues to sell well for Nintendo, so a successor is something he and the development team are looking into.

Thanks to takamaru64 for the tip!

150 thoughts on “Nintendo President Says He’s Contemplating A Nintendo 3DS Successor”

    1. More than likely they’ve already started. But why announce that they have a new system coming when the Switch launches in about a month? I believe that, for consoles at least, they start on the next one as soon as the current one is released.

      1. I’m actually inclined to believe they haven’t started yet because, assuming it would be made compatible with most of the same games, I can’t imagine why they would have used Nvidia in the Switch otherwise. Nvidia’s chips were never well suited for devices that small. The Switch isn’t even out yet, and there are already smaller devices with chips more capable than the Switch when docked, and they don’t even have the advantage of giant ventilation holes.

        The 3DS successor will likely have an SoC based on a GPU from someone else, will probably outperform the Switch’s mobile mode (or the Switch altogether), and will pretty much defeat the purpose of the Switch being portable.

        1. Keep dreaming. The advantage of the 3DS is that it’s a cheap platform to develop games for. If it’s going to be more powerful than the Switch, games will go up in price and the handheld will lose its position in the market (the low-end market). They still have to aim at a sub-Switch market if they want to reach that kind of developer and customer. The Switch will already exist; there’s no need for another Switch, just a cheap handheld as always.

          1. It’s not a dream at all. I have info for days. The Switch’s performance, especially in handheld mode, wouldn’t be that hard to beat while still staying cheap. If the 3DS successor isn’t more powerful than the Switch, it would be because Nintendo is deliberately making it less powerful. By the time the 3DS successor comes out, chips will be manufactured on a 10nm FinFET process, allowing them to be even more powerful. All they would have to do is put a Mali or PowerVR GPU and a few A73 cores onto an SoC and you’d have something smaller and more powerful than the Switch.

            1. It’s just 10x a Vita and just 100x a 3DS, with a 4,400 mAh battery. Yes… easy to beat. Meh.
              Certainly technology evolves, but that doesn’t mean the low end should get a reactor instead of a decent SoC.
              The A73 is just 15% more powerful than the A57, and that’s just the CPU. And no… PowerVR isn’t actually more performant per watt than Nvidia.
              The Switch has a 6.2-inch screen; if Nintendo doesn’t stick to a 4-inch screen for its next handheld (with that same 1,500 mAh battery), they aren’t of sound mind.
              They need to cover different markets: high end and low end. Like now.
              A 10 nm process will probably still be costly in two years, so they’ll stay on a cheap 20/28 nm process for the low end (with a low-powered SoC that needs a small battery, like now). The 3DS is actually on 45 nm. You can’t just think ‘slap this and that’ without understanding how Nintendo makes its hardware.
              The Switch is high end; it’s only because of that that a 20 nm process was chosen.

              1. PowerVR actually IS more performant per watt. That wasn’t true when Nvidia did their power tests between the Tegra X1 and the A8X, but the A8X used a 6XT-based GPU. The 7XT series was available for licensing several months before that and increased performance 60% clock-for-clock, bringing the performance of the A9 up to that of the A8X. Then, within the past year, Imagination worked with TSMC to improve the performance and efficiency of the 6XT and 7XT designs and, while they didn’t say what configuration they were testing, they exceeded their initial goal of 600MHz @ 2.5w by achieving 660MHz @ 2.35w. As a result, the 6-cluster GPU in the A10 Fusion is 50% faster than the one in the A9 while using less power and being manufactured on the same process.
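
                (If you want to sanity-check that perf/W claim, the arithmetic is simple enough to script. A rough sketch in Python, assuming performance scales linearly with clock speed; the 600MHz @ 2.5w goal and 660MHz @ 2.35w result are the figures above.)

                # Perf/W from the clock/power figures cited above, assuming
                # performance scales linearly with clock speed.
                goal = 600 / 2.5        # MHz per watt, Imagination's stated goal
                achieved = 660 / 2.35   # MHz per watt, the reported result
                print(f"perf/W vs. goal: +{achieved / goal - 1:.1%}")  # ~ +17%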

                The Pixel C is the only example of the X1 running in a battery-powered device, and the iPad Pro’s long-term performance is twice as high as the Pixel C’s with only 15% less battery life, despite driving 21% more pixels on a screen that’s 26% larger with a battery that’s only 13% larger.

                The iPad Pro’s top performance is also higher than the Shield Android TV’s, even though the Shield TV has the advantage of active cooling, and even the iPhone 7’s top performance is within 6% of the Shield TV’s.

                1. Are you joking? The A9X has roughly half the computational power of the Maxwell-powered Nintendo Switch SoC. And that (the A9X) is a battery-hungry tablet chipset. You are talking about an even less powerful SoC (the A8X has roughly a third of the power of the custom SoC inside the Nintendo Switch).
                  Numbers for numbers: the GT7800+ inside the A9X is 345 GFLOPS. The Switch’s should be over 500… rumored to be nearly 700.

                  1. You don’t compare things based on their theoretical performance, AKA GFLOPS; you judge them on actual performance.

                    Go to gfxbench.com and look up the devices yourself. Even though the Shield TV’s X1 is 512/1024 GFLOPS, the 345/691 GFLOPS A9X does outperform it.

                    PowerVR uses separate 16-bit ALUs that let it process 16-bit FLOPs at double speed more consistently, while Tegra relies on pairing instructions together, which doesn’t always work.

                    On top of that, PowerVR uses a TBDR architecture, allowing it to do a much better job of avoiding unnecessary shader operations and texture fetches, while being able to render the entire G-buffer in on-chip memory and avoid external memory traffic.
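
                    (Rough numbers, since bandwidth is the whole point here: a Python sketch of the external-memory traffic a 720p, 16-byte-per-pixel G-buffer costs an immediate-mode renderer versus a tile-based deferred one. The buffer sizes are illustrative assumptions, not any console’s real figures.)

                    # Hypothetical 720p target with a 128-bit (16-byte) G-buffer.
                    width, height, fps = 1280, 720, 60
                    gbuf = width * height * 16                  # bytes per frame
                    # An IMR writes the G-buffer to DRAM, then reads it back to shade:
                    imr = 2 * gbuf * fps
                    # A TBDR shades each tile from on-chip memory, so only the final
                    # 4-byte color buffer touches DRAM:
                    tbdr = width * height * 4 * fps
                    print(f"G-buffer: {gbuf / 2**20:.1f} MiB/frame")
                    print(f"IMR:  {imr / 2**30:.2f} GiB/s of external traffic")
                    print(f"TBDR: {tbdr / 2**30:.2f} GiB/s of external traffic")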

                    1. Nope, you do compare GFLOPS, like you do with the PS4 vs. the Xbox One.
                      Better than it? I’ll trust it when I see it. For now I have seen top graphics on Nintendo’s console, not on the iPad.

                      You should know that 16-bit isn’t used for 3D graphics but for other applications (financial, scientific, etc.), so I don’t really see the point. It’s useless to talk about ALUs if you don’t show an application for them. You can theoretically tell me that the Savage3D has a ‘wow’ feature that crushes the competition; that does not make the Savage3D more powerful than the competition. There was lots of chit-chat about the old Jaguar too, and it was a joke compared to the PlayStation.

                      There are two facts that matter:

                      1. GFLOPS + ROPs + TMUs.
                      2. Actual games, the 3D graphics shown.

                      Nothing else matters, not even 8-bit ALUs.

                      1. The reason you can compare GFLOPS between the XBO and PS4 is that they’re based on the exact same architecture. Maxwell and Rogue aren’t even made by the same company, and one is TBDR while the other is IMR, so they can’t be compared without benchmarking.

                        The reason you don’t see “top graphics” on the iPad is that games on the App Store are designed to work on a wide variety of hardware (including older phones and lower-end Android devices), not just the newest hardware. The iPad Pro also has a 2732×2048 screen while the Switch has a 1280×720 screen.

                        16-bit FLOPs CAN be and ARE used for 3D graphics. Fragment shaders deal with pixels, which start off as 16-bit floats and end up as 8- or 10-bit integers; 16 bits is more than enough precision for them. In fact, all of Uncharted 2’s fragment shaders were 16-bit because the PS3 could only do 16-bit fragment shaders.
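
                        (A quick numpy sketch of why fp16 is enough there; the “shader” math is a made-up toy, but the quantization step is the real point: once you round to 8-bit output, fp16 and fp32 results are essentially indistinguishable.)

                        import numpy as np

                        x = np.linspace(0.0, 1.0, 10000, dtype=np.float32)
                        f32 = (x * 0.75 + 0.1) ** 2.2                 # toy shading in fp32
                        f16 = (x.astype(np.float16) * np.float16(0.75)
                               + np.float16(0.1)) ** np.float16(2.2)  # same math in fp16

                        def to8(v):
                            # Quantize to the 8-bit integers a display buffer stores.
                            return np.round(np.clip(v.astype(np.float32), 0, 1) * 255)

                        print("max 8-bit error:", np.abs(to8(f32) - to8(f16)).max())  # prints 0 or 1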

                        You’re forgetting bandwidth, too. 3D graphics require high amounts of bandwidth, which is something TBDR is designed to get around.

                        Perhaps you’d be interested in the PDF Epic put together about running Unreal Engine 4 on mobile GPUs back in 2014.

                        https://cdn2.unrealengine.com/Resources/files/GDC2014_Next_Generation_Mobile_Rendering-2033767592.pdf

                        And here’s an example of “top graphics” on an Exynos-based Galaxy S7 running at 1080p.

                        And here is the Dwarf Hall demo running on an 8-cluster PowerVR GPU, showing physically based rendering at 4K and 60fps.

                        1. Look, people compare GFLOPS even in the heterogeneous PC market.
                          Actual Switch games are ported Wii U games, so…
                          You could easily make a demo on the A9X that would put Skyrim to shame, if you wanted to. But there isn’t one.

                          32-bit:
                          ‘Full float precision is generally used for world space positions, texture coordinates, or scalar computations involving complex functions such as trigonometry or power/exponentiation.’

                          16-bit:
                          ‘Half precision is useful for short vectors, directions, object space positions, high dynamic range colors.’

                          Also:
                          ‘First, Nvidia’s triple jump in compute is a bit misleading, since they are comparing the GP100’s 20 TFLOPS performance with 16-bit precision (half precision) compared to Maxwell’s 7 TFLOPS with 32-bit precision (single precision). Now, applications that can accept the use of lower-precision half floats could see up to three times the throughput. But those relying primarily on single-precision (3D graphics rendering primarily) would see only about 50% better (maximum).’
                          Those relying on single-precision (32-bit): 3D graphics rendering.

                          And yes, within limits you can do whatever you want, but 32-bit is still what’s used for 3D graphics. Are there subroutines that use 16-bit? Obviously. Is 16-bit of any importance? Nope.

                          1. If it were possible to judge the capabilities of two GPUs just by comparing peak theoretical FLOPS, then why would anybody run benchmarks? I’ll tell you why: because each architecture has different strengths and weaknesses. Do you think an IMR GPU capable of 20 TFLOPS is ever going to reach anywhere near its theoretical performance connected to some slow DDR memory? No. Because FLOPS are just one part of the puzzle. Even if that GPU has sufficient memory bandwidth to work with, it may experience a lot of stalls, which prevents it from reaching anywhere near its top performance. Theoretical performance doesn’t matter. Real performance does.

                            I’m not sure what point you were trying to prove by posting those two definitions from the Unity documentation. They explain that full precision is used for vertex shaders while half precision is used for fragment shaders, which are the majority of the work in modern games. After all, if half precision had next to no use for 3D games, why would it be mentioned in the documentation for a game engine?

                            Modern games use deferred rendering to prevent overdraw. The first step in a deferred rendering pipeline is to create a G-buffer, which is the process of rendering the geometry to different textures that represent the color buffer, normal buffer, depth buffer, and sometimes other buffers like AO, metalness, and stencil. These textures are then used by fragment shaders to do lighting, normal mapping, bokeh, color grading, etc. The most common formats for color buffers are four channels of 8-bit integers, or four channels of 16-bit floats for HDR rendering, while normal buffers are often made up of two 16-bit float channels. For example, Uncharted 4 packs a bunch of different data into what looks like a 128-bit G-buffer, and none of that data exceeds 16 bits of precision. Here’s a list of different G-buffer layouts from popular games.

                            http://d.hatena.ne.jp/hanecci/20130818/p1
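
                            (To make the “128-bit and nothing above fp16” point concrete, here’s a hypothetical G-buffer layout in that spirit, as a quick byte-budget check in Python. This is an illustration, not Uncharted 4’s actual format.)

                            # A made-up 128-bit (16 bytes per pixel) G-buffer layout.
                            gbuffer = [
                                ("albedo RGB + AO",              "RGBA8", 4),
                                ("normal XY",                    "RG16F", 4),
                                ("metal/rough/material/stencil", "RGBA8", 4),
                                ("motion vector XY",             "RG16F", 4),
                            ]
                            total = sum(size for _, _, size in gbuffer)
                            assert total == 16        # 16 bytes = 128 bits per pixel
                            print(f"{total * 8} bits/pixel, widest channel: 16-bit float")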

                            Essentially, after creating the G-buffer, a large portion of the remaining work can be done with half-precision math. That is what mobile GPUs choose to do and that’s what the PlayStation 3 HAD to do. Doing things at unnecessarily high precision is just a waste of bandwidth and power.

                            Switch games are not just ported Wii U games. Arms, Xenoblade Chronicles 2, and Mario Odyssey are all original games. Even then, why does it matter if a game is a Wii U port? It’s not being ported from some exotic architecture like the PS3; it’s being ported from one straightforward design to another.

                            I’m sure none of this is going to convince you of anything, though, and that’s probably because you refuse to look at benchmarks. Here, I’ll do the work for you. These are benchmarks between the iPad Pro and the Shield TV. If you want to see results for Manhattan 3.1 and 3.1.1, change the iPad’s API to Metal. Although Metal is supposed to be a lower-level API, the Manhattan and T-Rex benchmarks still seem to perform better in OpenGL.

                            https://gfxbench.com/compare.jsp?benchmark=gfx40&did1=27138730&os1=iOS&api1=gl&hwtype1=GPU&hwname1=Apple+A9X+GPU&D2=NVIDIA+Shield+Android+TV

                            1. That benchmark is useless and you know it. Different systems, different optimizations, no real-world games.
                              The A9X is a 400 MHz GPU capable of 350 GFLOPS; there is no chance it’s more powerful than that Nvidia technology, nor does a single A9X demo exist that compares to a Switch demo. Also, if you look up there, you were saying the A8X was more efficient than the Tegra… and it is not; I attached a document too.
                              3D graphics is made of 32-bit floating point; any article will tell you that 32-bit makes up most of the computational requirements. Any.

                              1. Please read carefully. This is the fourth time you’ve said I was claiming the A8X is more efficient, and I have told you four times that the A8X is NOT more efficient than the TX1. How do you keep interpreting me as saying the opposite?

                                What I DID say about the A8X versus the TX1 in the last post was that the TX1 had the advantage of being newer and could get the same amount of bandwidth using fewer pins, and accessing memory does use up a decent amount of power. The other thing I brought up in relation to the TX1 vs the A8X and the A9X is that the TX1, in both the Shield TV and the test that Nvidia did, had cooling. In the case of Nvidia’s test, there was a large heatsink on the TX1. In the case of the Shield TV, it has a small heatsink and fan. These are the performance results Nvidia got and showed off (taken from the article) vs the A8X.

                                TX1 / A8X
                                Manhattan – 4077 (65.8 fps) / 2480 (40.0 fps)
                                T-Rex – 6957 (124.2 fps) / 4231 (75.6 fps)

                                That’s a huge difference. The TX1 does 64% better in both Manhattan and T-Rex. Let’s see what happens when you take the heatsink away from the TX1, though (I used Pixel C scores).

                                TX1 / A8X
                                Manhattan – 2965 (47.8 fps) / 2480 (40.0 fps)
                                T-Rex – 4753 (84.9 fps) / 4231 (75.6 fps)

                                Now the TX1 is only beating it by 17% in Manhattan and 12% in T-Rex. I’M NOT SAYING THAT MAKES THE A8X MORE EFFICIENT. The TX1 has better performance per watt BUT, without cooling, the TX1 loses a lot of its performance in a comparably heat-constrained situation. The Pixel C is also larger than the iPad Air 2, so the TX1 is still at an advantage in this regard. Since you think mobile benchmarks use a huge number of draw calls, I’ll also point out that the iPad Air 2 has a lower Driver Overhead score (1437) than both the Pixel C (1440) and the Shield TV (1759). You don’t care, though, because it’s still iOS vs Android and, at this point, you’re too invested in the TX1 to change your mind now, but I’ll try anyway.

                                Here’s a comparison you might like. I’m gonna take two Android devices, both with PowerVR Series 6 GPUs, one 89.6 GFLOPS (ZenPad) and the other 136.4 GFLOPS (ZenFone), and put them up against the 115.2 GFLOPS Series 6 GPU in the iPhone 5s. This is as fair a test as I can get between the two platforms.

                                ZenPad 3S / ZenFone 2 / iPhone 5s
                                Manhattan 3.1.1: 205 / – / 169 (Huh. Looks like the 90 GFLOPPER wins)
                                Manhattan 3.1: 405 / 414 / 333 (iPhone comes last)
                                Manhattan: 869 / 907 / 718 (One for iOS)
                                T-Rex: 1744 / 1355 / 1559 (Another for Android)

                                And here are the low-level tests which, with the exception of the actual driver test, don’t seem to be much affected by drivers or the OS.

                                ALU2: 949 / 1375 / 1212 (Exactly what you would expect)
                                Texturing: 2101 / 2262 / 2426 (iPhone wins this one)
                                ALU: – / 5552 / 4329 (Exactly what you would expect)
                                Alpha Blending: – / 5558 / 6445 (iPhone wins this one, too)
                                Driver Overhead: – / 324.5 / 33.3 (iPhone had 1/10th the overhead!)
                                Fill: – / 3065 / 2566 (Exactly what you would expect yet again)

                                It doesn’t look like you have a case for the Tegra X1 being held back by Android. When a PowerVR chip is used under the same operating system, it sometimes does better, sometimes does worse, but mostly runs exactly how you would expect.

                                You got your specs wrong for the A9X, btw. It’s either 450MHz or 467MHz, not 400MHz. The fact that you would say it’s 400MHz AND 350 GFLOPS shows, yet again, that you don’t know where GFLOPS ratings come from. If you really want to compare theoretical GFLOPS, though, then let’s do that. A 1GHz TX1 (512/1024) would be 42% ahead of the A9X (358.7/717.3). The Switch’s docked clock speed is 768MHz. People are assuming this is the highest it can go without thermally throttling; judging by the roughly 25-30% performance dip the TX1 takes in the Pixel C, that looks accurate. That brings the Switch to 393.2/786.4 GFLOPS, which is only 9.6% higher than the A9X. When the Switch isn’t docked, it’s 307.2MHz, or 157.3/314.6 GFLOPS. So, even in your world where theoretical performance numbers are real, the A9X and TX1 are at least very comparable given the same thermal situation, and you can’t make a case for the A9X being weaker than the Switch.
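
                                (For anyone who wants to check where those figures come from: peak FLOPS is just ALU count × 2 FLOPs per fused multiply-add × clock, with fp16 at double rate on both chips. A quick Python sketch; the 256-core TX1 count is from Nvidia’s whitepaper, while the 12-cluster × 32-pipe A9X configuration is the commonly reported one, not an official spec.)

                                # Peak theoretical GFLOPS = ALUs * 2 (FMA) * clock in GHz.
                                def gflops(alus, ghz):
                                    fp32 = alus * 2 * ghz
                                    return round(fp32, 1), round(fp32 * 2, 1)  # (fp32, fp16)

                                print(gflops(256, 1.0))        # TX1 @ 1 GHz   -> (512.0, 1024.0)
                                print(gflops(256, 0.768))      # TX1 docked    -> (393.2, 786.4)
                                print(gflops(256, 0.3072))     # TX1 handheld  -> (157.3, 314.6)
                                print(gflops(12 * 32, 0.467))  # A9X @ 467 MHz -> (358.7, 717.3)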

                                But let’s stick to theoretical numbers for a minute. The iPhone 7 is a 257.3/514.5 GFLOPS part (assuming the clock speed really is 670MHz) that apparently lasts for 2-4 hours on a 7.45 Wh battery. If the TX1 is a more efficient chip, then why is it getting 2.5-3 hours on a 15.9 Wh battery at 40% lower performance (157.3/314.6 GFLOPS)? Even the iPhone 6s had higher GFLOPS (172.8/345.6) than the Switch on battery and was getting 2 hours out of a 6.55 Wh battery. It seems that, theoretically, even an A9 would have slightly higher performance while getting 4.85 hours on the same battery, and the A10 would get much better performance for 4.2-8.5 hours.

                                This is the real world, though, and theoretical performance is precisely that: theoretical. In the real world, things like TBDR vs TBR vs IMR, whether a chip can process 16-bit FLOPs at double speed, bandwidth, and the general efficiency of its ALUs, ROPs, and TMUs all matter. And yes, fp16 is useful for graphics. fp32 is way more precision than you need for fragment shaders, and using it over fp16 is a waste of bandwidth and power. Like I said, the PS3 could only do fp16 fragment shaders, and not once was there a claim that the PS3’s post-processing looked lower-precision than the X360’s. Quite the opposite was true, considering Uncharted 3 and The Last of Us were considered the best-looking games of last generation. Even the double-speed INT16 and quadruple-speed INT8 that PVR, Mali, and I think Adreno have would be useful in 3D graphics, especially for GameCube and Wii emulation. The GameCube and Wii’s GPU uses 8-bit INTs for its TEV, fog, Z-compare, blend, bump, frame buffer, rasterization, lighting, indirect texture, clipping, and culling, as was explained in this Dolphin blog post.

                                https://dolphin-emu.org/blog/2014/03/15/pixel-processing-problems/

                                You might notice that those are all part of the pixel/fragment pipeline. The Dolphin team hasn’t moved all of that code over to INTs yet for performance reasons, but in cases where they can use lower-precision INTs, there would actually be a speedup on mobile architectures.

                                The reason for all of this is that graphics for gaming are not about accuracy but approximation. If a dev can get away with using int8s or fp16s and get a large speed boost out of it, then they’re going to do it.

                                One more thing. Since we established that the low-level tests aren’t much affected by the OS or drivers, let’s look at how the A9X and TX1 compare. Feel free to ignore them anyway.

                                Shield TV / iPad Pro
                                Texturing: 12.3 / 13.8 Gtexels (Pixel C was 9.2 Gtexels)
                                ALU: 466.1 / 308.1 fps
                                Alpha Blending: 21.6 / 24.9 GB/s
                                Fill: 14.6 / 15.9 Gtexels
                                ALU2: 94.4 / 137.1 fps (Pixel C was 81.5)

                                1. Why do you continue to post benchmark results? Haven’t we already agreed that a benchmark across two different platforms is severely flawed?
                                  Where did you find the clock rate for the Tegra? At that frequency it would be even less powerful than the Wii U. I think it’s wrong data.
                                  And comparing the iPhone 6, a smartphone, to the Switch demonstrates how useless this discussion has become.

                                  1. Well, we know that 1080p is 2.25x the resolution of 720p and that the max clock of the Tegra X1 is about 1GHz. So even if it were at exactly 1GHz when docked, it would be about 444MHz when undocked.

                                    Digital Foundry posted that the Switch developer documentation says it’s 768MHz when docked and 307.2MHz when not. Another rumor that spread earlier came from someone who works at Foxconn; he not only claimed the exact battery capacity that we now know to be true but also said the Joy-Cons would come in orange and blue. That leaker said it was clocked at 921MHz when he was testing it.

                                    All those clocks make sense if the Tegra X1’s base clock is 76.8MHz. A 4x multiplier makes it 307.2MHz, a 10x multiplier makes it 768MHz, and 12x makes it 921.6MHz. The belief is that they were testing to see what clocks they could hit without getting too hot. Again, when you look at the performance drop of the Pixel C relative to the Shield TV, it seems it throttled down to between its 9x and 10x multiplier, which would put it at 691.2-768MHz.
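
                                    (The multiplier claim is easy to check. A sketch, taking the 76.8MHz base clock as the working assumption:)

                                    base = 76.8  # assumed TX1 base clock in MHz
                                    for mult in (4, 9, 10, 12):
                                        print(f"{mult:>2}x -> {base * mult:.1f} MHz")
                                    # 4x -> 307.2 (handheld), 10x -> 768.0 (docked), 12x -> 921.6 (leak)
                                    print((1920 * 1080) / (1280 * 720))  # 2.25x the pixels at 1080p
                                    print(768 / 307.2)                   # 2.5x the clock when docked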

                                    And read the whole post. I know they’re long, but I just showed that nearly identical GPUs performed better on Android than iOS in 3 out of 4 tests and that the low-level benchmarks gave results exactly as you would expect. Those benchmarks compare Android vs iOS and show that they ARE comparable. Seriously, you knew that PVR was used in Android phones and tablets before, and you never thought to investigate how differently they performed across platforms? Well, of course not. We’ve been arguing for days now and you’re pretty committed to this Tegra thing, so you can’t admit you’re wrong.

                                    And how is comparing a smartphone to the Switch useless? Both are portable devices that need to perform in heat-constrained environments.

                                    Here’s what’s pretty obvious about you. You have a bias toward the guys who make desktop video cards. You think 32-bit FLOPS are all that matters because desktop cards don’t use 16-bit, you’ll give Qualcomm credit for their GPU but only because you think their current design is still based on AMD’s design from 10 years ago, and you can’t imagine that Nvidia’s chips aren’t the best suited for mobile usage because they’re by far the most energy-efficient in the desktop space.

                                    Actually, even according to your logic, it would be on par with the Wii U at those clock speeds. It would be 157.3/314.6 GFLOPS. The Wii U was likely 176 GFLOPS, though some say it’s 352 GFLOPS. The Maxwell architecture is more efficient, so it outperforms those numbers. Again, GFLOPS ratings are peak THEORETICAL performance. If you don’t believe me, then check out this post from the Nvidia forums.

                                    https://devtalk.nvidia.com/default/topic/415741/cuda-programming-and-performance/gpu-perfomance-how-much-gflops-/

                                    “Bottom line is: what you’re given in the specs is peak performance flops, a very optimistic estimation based around the assumption of absolutely perfect instruction scheduling, using the “right” arithmetic instructions (a MAD and something that can be dual issued with it, commonly a MUL), and absolutely no memory bandwidth limitation (the biggest factor).

                                    You are likely never going to see this performance in your code. Unless you’re talking about double precision arithmetic (not on this GPU). You can get 80-90% of double precision peak performance much easier but it’s really because DP is so much slower than single precision it’s less likely to be limited by bandwidth.”

                                    1. Digital Foundry posted a rumour. Talking about rumours makes little sense.
                                      It’s obvious that performance should match a tablet’s in portable mode, since it depends on a battery. But it gets all the software-side optimizations that no tablet or smartphone can have: the API, the engines, a tailored OS, etc.
                                      When in the dock it gets fairly overclocked, and a game like Fast RMX goes 1080p/60. You can’t do that on the iPhone: the thermal constraints are tougher than a tablet’s, the software optimizations aren’t in place, etc.
                                      In portable mode it’s probably a Wii U with just a better CPU; in the dock it’s fairly more powerful than the Wii U, in fact every game runs at a higher resolution. That’s what we know, rumors aside.

                                      Theoretical specs are true for every SoC, on smartphones too. We know there are no Wii U-level graphics on the iPhone; that should suffice to say the Switch is better than that smartphone and many others. Technology will continue to evolve, but software optimizations like the Switch’s are definitely not coming to any non-pure-gaming platform.

                                      1. The reason I brought up phones was as an example of battery life, since a phone can’t have a battery anywhere near the size of the Switch’s unless it’s bordering on phablet size. I was providing yet another reason that Nvidia was a bad choice for the Switch.

                                        What DF posted was from developer documentation. This is the same source that was the first to leak what the Switch was and that it was Tegra-powered. They even went as far as saying it was based on the X1, and recently ARM confirmed that the Switch uses the same A57 CPU cores that are in the X1. Those clock speeds, like I already said, not only scale according to resolution (actually a little further) but fit the base-clock multiples of the X1. Even if they weren’t true, which they most likely are, they wouldn’t be far off from the real ones. What kind of clock speeds were you expecting? The most you could expect is one multiple up, which would be 384-921.6MHz. Any higher and it’s 460.8-1075.2MHz, which goes above the TX1’s max clock speed.

                                        There’s no reason to assume you can’t do games with the same graphics and performance as the Switch on current phones. The Switch’s thermal constraints are tailored for its docked mode, not its handheld mode. Besides that, its theoretical FLOPS at its handheld clock speed are on par with or worse than the iPhone 6s, iPhone 7, and Galaxy S7, and the iPhones have just as much memory bandwidth while the S7 has more. By your own line of reasoning, they have to be just as capable or more capable than the Switch while having better battery life.

                                        The reason we don’t see crazy graphics on phones is not the engines. Part of it is the lack of real low-level APIs, like you said, but more than anything else it’s because games made for smartphones need to run on old hardware, too. Even in the last 4 years, iPhone GPUs have seen as much as a 6x increase in performance, yet even a great-looking mobile game like CSR Racing 2 looks and plays exactly the same on old and new models. Same thing goes for Android: the same game looks the same on a Note 4 as it does on a Galaxy S7, despite a 2-3x performance difference between them. The Note 4 and iPhone 5s didn’t even support ASTC, so anyone would agree that CSR Racing 2 could run with better shaders, better textures, and better resolutions on the 7 and S7. Just imagine this with better shaders, polygon counts, and textures.

                                        I would say that looks like a Switch game. Zynga apparently did have CSR2 running at 2732×2048 on the iPad Pro, though, which I presume is what this shot is from.

                                        https://cdn3.vox-cdn.com/uploads/chorus_asset/file/6722399/CSR2_LaFerrari_v_Koenigsegg_01-1.0.png

                                        As for the Switch scaling up to 1080p, I’m not as confident as you are that the more demanding games are gonna be able to do it. It doesn’t surprise me that Shin’en can get a great-looking game like Fast RMX running at 1080p60 on the Switch, because they used to be part of the demo scene. Their whole claim to fame is making great-looking games that are also super small; they made a crazy sprite-based game for the GBA that looks 3D. Even when it comes to Nintendo, Breath of the Wild is releasing on the Wii U too, yet it only runs as high as 900p on the Switch and has performance dips. And games like Splatoon 2 and Mario Odyssey both appear to be running at 720p when docked. I know Mario has 9 more months of polish before release, but Splatoon 2 isn’t that far from launch. On the Switch, developers have to optimize bandwidth usage for TV mode and shader usage for handheld mode. If they don’t, they’re gonna have trouble scaling up. That’s why I’m suggesting that eDRAM or tile-based rendering would have been ideal.

                                        “Theoretical specs are true for every SoC, on smartphones too”

                                        I don’t know what you were trying to say with that.

                                          1. Only the company can tell whether it was a good choice or a bad one. I previously told you that Nintendo is buying Nvidia’s software technology too, so they get the SoC, the API, the tools, etc. No one knows their agreements nor the price of the entire project. Obviously Nintendo could have designed a fantastic SoC, but with no API, tools, expertise, warranties, etc., it’s of no use.
                                            This isn’t a business about numbers; this is a billion-dollar business where the variables are so many that we can’t argue them.
                                            It’s a custom Tegra X1 (no one knows the specs; people only know the supposed specs of the developer kit, which is in no way guaranteed to be the final hardware; it does not represent the Switch, just a machine to target in the initial development phase).
                                            An X1 at 300 MHz probably would not be enough for BotW, so it’s all hot air. The culprit here is Nintendo, which didn’t want to share specs publicly and so invited this useless speculation that’s against their interest too. It’s clear that there isn’t a single game on those smartphones as graphically complex as Skyrim.
                                            It lasts 3 hours on a 4,000+ mAh battery, so it can’t be far off those smartphones; the software difference, though, is night and day. This is a ‘pure gaming’ customized system designed to do just one thing: 3D gaming. At its best.
                                            No problems have surfaced on the software side, no problems with Splatoon nor Mario. Everything is proceeding quickly and the system has gotten only praise from developers. There are no problems out there, not a single one. There will be no Mass Effect? Peace. It’s the publisher that decides. If they want, they can do it by cutting here and there, like they’ve always done. Do you remember ports from Xbox (64 MB) to PS2 (32 MB)? I worked at a software house and saw a very optimized Xbox game ported to PS2: my colleagues said ‘we can’t do it’, then after the producer said ‘you have to’, they did it. Developers can do anything they want if they need or want to. Peace.

                                            1. “Only the company can tell whether it was a good choice or a bad one. I previously told you that Nintendo is buying Nvidia’s software technology too, so they get the SoC, the API, the tools, etc. No one knows their agreements nor the price of the entire project. Obviously Nintendo could have designed a fantastic SoC, but with no API, tools, expertise, warranties, etc., it’s of no use.”

                                              Nintendo is part of the Khronos Group. In what world would they not have had access to Vulkan? It’s possible that NVN IS just Vulkan with extensions by Nvidia and Nintendo. I’m not sure why you brought up expertise as if Nvidia is the only company with expertise in gaming and Nintendo, Qualcomm, ARM, or Imagination wouldn’t know shit about anything. As for tools, you’re acting like PVR and Mali don’t already have testing tools that are freely available. Also, warranties? What?

                                              “This isn’t a business about numbers; this is a billion-dollar business where the variables are so many that we can’t argue them.”

                                              Not a business about numbers? How do you figure? All businesses are about numbers. This is a tech business, so it’s even more about numbers. I mean, fuck, you just said variables. What do you think goes into those variables? Right, numbers.

                                              “It’s a custom Tegra X1 (no one knows the specs; people only know the supposed specs of the developer kit, which is in no way guaranteed to be the final hardware; it does not represent the Switch, just a machine to target in the initial development phase).
                                              An X1 at 300 MHz probably would not be enough for BotW, so it’s all hot air.”

                                              Your own logic says that a 300MHz X1 CAN handle BotW. Again, that would make it a 157.3/314.6 GFLOPS console rendering a game made for a 176 GFLOPS console. The Switch also has twice as much main memory bandwidth and supports delta color compression. Again, what clock speeds WOULD make sense to you? You know there has to be a clock boost of at least 2.25x, so what clock speeds make sense to you?

                                              “The culprit here is Nintendo, which didn’t want to share specs publicly and so invited this useless speculation that’s against their interest too. It’s clear that there isn’t a single game on those smartphones as graphically complex as Skyrim.”

                                              Skyrim on the Switch is not that impressive a game. We already knew it can run X360/PS3 games, so why would Skyrim at 30fps be impressive? Sure, it’s the remastered version, but it’s just the kind of graphics I would expect when I know the improvements mobile GPUs have made in recent years and I know that phones from 2013 were running things like Epic Citadel at greater than 30fps. Those phones could already do more than that benchmark if they were targeting 720p and 30fps. You think newer chips that are several times more capable than those wouldn’t be able to add some ground clutter and grass? You’re losing your mind over large foggy vistas that mostly benefit from improved art style, and ground clutter that could easily be done with techniques similar to Xenoblade Chronicles on the Wii.

                                              “It lasts 3 hours on a 4,000+ mAh battery, so it can’t be far off those smartphones”

                                              Except it is far off from them. Current phones are getting similar or greater battery life with smaller batteries. The OnePlus 3T, the Mali- and Adreno-based Galaxy S7, and the iPhone 7 Plus are all getting comparable or better battery life despite having 3400, 3000, and 2900 mAh batteries respectively. I don’t really know of any phone that has a 4310 mAh battery. I would also point out that they’re getting performance similar to the TX1 in the Pixel C, but it seems you’re not even interested in Android-to-Android benchmark comparisons.

                                              “the software difference, though, is night and day. This is a ‘pure gaming’ customized system designed to do just one thing: 3D gaming. At its best.”

                                            And it would still be a customized system made for gaming if it used PowerVR or Mali. It’s like you’ve completely abandoned your point and now you’re just trying to sell me on the Switch.

                                              “No problems have surfaced on the software side, no problems with Splatoon nor Mario.”

                                            Why would they have problems? What kind of problems would you expect?

                                              “Everything is proceeding quickly and the system has gotten only praise from developers.”

                                              Yea, a good portion of that comes from their work with Unity and Epic to get those engines supported. That’s not a result of Nvidia’s involvement. Also, we don’t know how quickly things have gone, because devs haven’t been able to talk about this stuff until recently.

                                              “There are no problems out there, not a single one.”

                                            Yea, sure. That’s why the Switch is getting all the X360/PS3 games that the Wii U never got. That’s sure something to be proud of. /s

                                              “There will be no Mass Effect? Peace. It’s the publisher that decides. If they want, they can do it by cutting here and there, like they’ve always done. Do you remember ports from Xbox (64 MB) to PS2 (32 MB)? I worked at a software house and saw a very optimized Xbox game ported to PS2: my colleagues said ‘we can’t do it’, then after the producer said ‘you have to’, they did it. Developers can do anything they want if they need or want to. Peace.”

                                              What game was this? And I know devs will be able to port stuff to the Switch if they want to, but what does this have to do with Tegra and the Switch’s ability to scale from 720p to 1080p easily? The bandwidth isn’t there for someone to fully exploit it in both handheld mode and TV mode.

                                              You can just admit you were mistaken instead of doubling down on things.

                                              1. You always make things simple. Nvidia tailors its software technology to its products. The ease of development for the platform is probably credited to Nvidia more than Nintendo.
                                                Nvidia is far ahead of any company you cited in development tools. Your ‘free’ tools don’t compare at all with Nvidia’s premium software.
                                                Warranties that the technology has no quirks, about production, development times and costs, etc.
                                                No clock speed makes sense to me, since I’m not inside Nintendo. It’s a fact that Wii U software runs on beefed-up hardware, so a 170 GFLOPS machine could pose problems. No one knows; it’s just speculation. The fact is that Mario Kart and Zelda are exactly the same on two different pieces of hardware, and they were probably already exploiting the Wii U’s hardware.

                                                Skyrim is graphically very complex; it runs at 50 fps on a GTX 770, and the Switch version is obviously cut down, while still looking very nice for a supposed tablet.

                                                It was a racing game: cut effects and textures, reduce some models, and there you go.

                                                Many games already scale to 1080p (Mario Kart 8, Fast RMX, etc.), so bandwidth is OK. If a game requires a lower resolution, it will get it. No one will moan about it except some geek.

                                                Mistaken about what?

                                                1. “You always make things simple”

                                                Tech isn’t magic.

                                                  “Nvidia tailors its software technology to its products.”

                                                As do PVR and ARM.

                                                  “The ease of development for the platform is probably credited to Nvidia more than Nintendo.”

                                                  Yea, “probably”, but that doesn’t mean Nvidia is the only company with any expertise in these things.

                                                  “Nvidia is far ahead of any company you cited in development tools.”

                                                Have you used these tools? I think not.

                                                  “Your ‘free’ tools don’t compare at all with Nvidia’s premium software.”

                                                  Premium software like what? Literally all of Nvidia’s stuff is available for free, too.

                                                https://developer.nvidia.com/gameworks-tools-overview

                                                  “Warranties that the technology has no quirks, about production, development times and costs, etc.”

                                                  And Imagination, Qualcomm, and ARM all make quirky designs? They’re all industry powerhouses that have been around longer than Nvidia. You’re just assuming Nvidia has the edge because you have a desktop GPU bias.

                                                  “No clock speed makes sense to me, since I’m not inside Nintendo.”

                                                  Bullshit. If no clock speeds make sense to you, then why have you consistently said that a 307MHz TX1 would not be powerful enough to run Wii U software?

                                                  “It’s a fact that Wii U software runs on beefed-up hardware, so a 170 GFLOPS machine could pose problems.”

                                                No, Wii U software runs on the Wii U. What’s beefed up about it? Do you want to go into any details?

                                                  “No one knows; it’s just speculation.”

                                                  The Switch or the Wii U? Because you just claimed to drop a “fact” about the Wii U. As for the Switch, it’s called “using your brain”. Do the math. Figure out what additional requirements rendering at 1080p has versus 720p, consider how the hardware would scale to meet those requirements, and plug things in.

                                                  “The fact is that Mario Kart and Zelda are exactly the same on two different pieces of hardware, and they were probably already exploiting the Wii U’s hardware.”

                                                  You realize that the Switch’s OS is likely based on the Wii U’s OS, right? It’s possible that a lot of the code for these games didn’t have to be changed, just recompiled. Yes, there were things they could do to better exploit the Wii U’s hardware, but they weren’t coding to the metal or anything. They used APIs.

                                                  “Skyrim is graphically very complex; it runs at 50 fps on a GTX 770, and the Switch version is obviously cut down, while still looking very nice for a supposed tablet.”

                                                  I was clearly referring to the Switch version, which is not that graphically complex. Look at any footage from it that isn’t a vista, like where the character is fighting something, and it doesn’t look that great.

                                                  “It was a racing game: cut effects and textures, reduce some models, and there you go.”

                                                Uhh okay.

                                                  “Many games already scale to 1080p (Mario Kart 8, Fast RMX, etc.), so bandwidth is OK.”

                                                You can’t say that and also say…

                                                  “If a game requires a lower resolution, it will get it.”

                                                  If bandwidth isn’t scaling with the rest of the system, then that is a problem. Again, that’s why I think a tile-based GPU or eDRAM would have been a better fit.

                                                  “No one will moan about it except some geek.”

                                                  People were immediately disappointed that BotW only runs at 900p docked, and people are disappointed to see that Splatoon 2 also seems to run at just 720p when docked.

                                                  “Mistaken about what?”

                                                  Where should I start? Let’s see. You are mistaken about what GFLOPS are. You’re mistaken about the usefulness of FP16. You’re mistaken about the importance of bandwidth. You’re mistaken about the usefulness of HEVC and ASTC. You mistakenly thought Nvidia is the king of all mobile GPU designers with the most powerful and efficient chips. And you mistakenly thought that the Switch is far and beyond more capable than any other tablet or phone on the market. It isn’t.

                                                  1. Nvidia has always been the best at this. Ask ANY developer.
                                                    You are insisting on a stupid point, since ANYONE in the industry knows that Nvidia’s tools are the best. Anyone.
                                                    This is established. And this insistence on this point shows that you aren’t aware of software development.

                                                    They don’t do top engineering, except maybe Qualcomm; still, Nvidia has top technology and the very best software side.
                                                    Plus, you don’t know the technology inside the Switch, only speculation. Plus, Nvidia has its own special API set; Qualcomm does not.
                                                    Still, Nintendo could have even chosen MediaTek… they chose Nvidia because it’s proven top technology at every level. Plus a very reliable partner.

                                                    Wii U software is optimized for it. Developers get all of its power, so this handheld should be at least as powerful as that. At least. A half-as-powerful GPU should not run the same games, so I suppose the unofficial specs may well be bullshit and that it’s a ‘custom’ SoC after all, like they said, optimized for Nintendo’s needs. I repeat: no one knows what’s inside. People just speculate on developer boxes, and even a turd knows those sport technology similar to the final console’s, but not THAT technology.

                                                    Wii U technology is completely different from the Switch’s; everything has changed at the ‘driver/tools’ level. The rest of the OS can be similar, but that’s just speculation. We know for sure that the technologies behind the SoCs are night and day.

                                                    FP16 isn’t needed the way FP32 is; FP32 carries most of the 3D calculations. Fact. If you don’t know that, study it better. FP16 is used just for some post-processing (HDR), where full precision isn’t needed. Maybe geometry culling. That’s it. All the rest, the meat, needs FP32.
                                                    FP16 is useful for general-purpose mathematics; it was made for that (those big compute farms) and publicized for 3D graphics to appeal to turds (PS4 Pro). But 3D developers’ needs are at FP32.

                                                    You keep talking about ‘mistaken’ and blah blah blah, but you don’t know what you’re talking about. You’re just reading some site and pulling words together, BUT the games are there and work flawlessly. Zelda and Mario Kart are there, and upscaled on the TV.

                                                    1. “Nvidia has always been the best at this. Ask ANY developer.
                                                      You are insisting on a stupid point, since ANYONE in the industry knows that Nvidia’s tools are the best. Anyone.”

                                                    Yea, I’ll go ask a developer. Or you can provide a statement from a developer saying that.

                                                      “This is established. And this insistence on this point shows that you aren’t aware of software development.”

                                                      Dude, I know you said you worked at a software company that made games for the Xbox and PS2, but it doesn’t sound like you were actually part of the software development. Maybe you were an artist, or did sound, or got them coffee.

                                                      “They don’t do top engineering, except maybe Qualcomm; still, Nvidia has top technology and the very best software side.”

                                                      What is top engineering? I feel like I’m arguing with a kid. Engineering like what? Chip development? Because they all have experience with these things. They have to, or they couldn’t make chips.

“Plus, you don’t know the technology inside the Switch, only speculation.”

Sure, buddy. I mean, we know the clock speeds, core counts, and power usage of the X1, P1, and Xavier SoCs, we have benchmarks and battery life info from products that use the X1, and we can see the power usage of the Switch matching known X1 devices, but I guess that’s all just speculative, right? I guess it’s just as likely that the Switch has 512 cores clocked at 1.5GHz with 100GB/s of memory bandwidth, right? And why use common sense when we can just use pure ignorance, right?

“Plus nVidia has its own special API set; Qualcomm does not.”

Like I said, NVN could just be Vulkan with extensions by Nvidia and Nintendo. You do know that Nintendo has actually made APIs before, and even added custom instructions to their CPUs, right? You’re making everybody else out to be chumps. You realize that all a company like Nvidia, AMD, Imagination, Qualcomm, or ARM has to do to make a graphics API is make something more analogous to the instruction sets their GPUs use, right?

“Still, Nintendo could have chosen even Mediatek…”

                                                    Why would they choose Mediatek? Do… do you think that Mediatek owns PowerVR? Mediatek just licenses PowerVR. PowerVR is owned by Imagination Technologies, the same company that owns MIPS (CPU cores), Ensigma (Radio DSP cores and Wireless technologies), Pure (a wireless speaker company), IMGWorks (SOC DESIGN and SOFTWARE INTEGRATION SERVICES), and FlowTalk (VoIP).

“they chose nVidia because it’s proven top technology at every level. Plus a very reliable partner.”

                                                    Yea, at every level? Show me all those phones and tablets using Nvidia chips.

                                                    “Wii U software is optimized for it.”

                                                    Optimized for what? Nvidia? It used an AMD GPU.

“Developers get all of its power, so this handheld should be at least as powerful as that. At least. A half-as-powerful GPU could not run the same games”

I never suggested a “half-as-powerful GPU” could handle the same games.

“so I suppose the unofficial specs may well be bullshit and that it’s actually a ‘custom’ SoC like they said, optimized for Nintendo’s needs.”

Yea, “optimized for Nintendo’s needs” means that all the shit that’s unnecessary will be optimized out, like the A53s and the HDMI, dual display, eDP, ISP, and MIPI CSI blocks. Here, read about it.

                                                    http://international.download.nvidia.com/pdf/tegra/Tegra-X1-whitepaper-v1.0.pdf

                                                    Honestly, do you really think they’re making a completely custom chip for Nintendo? Because I’ll tell you right now that they aren’t.

“I repeat: no one knows what’s inside. People just speculate on developer boxes, and even a turd knows those sport technology similar to the final console’s, but not THAT technology.”

                                                    Literally have no idea what you’re saying here. Something about turds.

“Wii U technology is completely different from the Switch’s; everything has changed at the ‘driver/tools’ level. The rest of the OS could be similar, but that’s just speculation. We know for sure that the technologies behind the SoCs are night and day.”

Yea, the technologies behind the SoCs are very different. That’s why, at 157 GFLOPS in portable mode, the Switch can outperform the 176-GFLOP Wii U.
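For anyone following along, the 157 number falls straight out of the standard peak-FLOPS formula. A minimal sketch in Python, assuming the leaked 256-core Maxwell GPU and the leaked 307.2MHz portable / 768MHz docked clocks (none of this is confirmed by Nintendo):

# Peak FP32 throughput for an assumed 256-core Maxwell-based Tegra X1.
# Clocks are the leaked figures, not confirmed specs.
CUDA_CORES = 256          # shader ALUs
FLOPS_PER_CORE_CYCLE = 2  # one fused multiply-add = 2 FLOPs

def gflops(clock_mhz):
    return CUDA_CORES * FLOPS_PER_CORE_CYCLE * clock_mhz / 1000.0

print(gflops(307.2))  # portable clock -> ~157.3 GFLOPS
print(gflops(768.0))  # docked clock   -> ~393.2 GFLOPS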

“FP16 isn’t needed the way FP32 is; FP32 does the most in 3D calculations. Fact. If you don’t know that, study it better. FP16 is used just for some post-processing (HDR), where full precision isn’t needed. Maybe geometry culling. That’s it. All the rest, the meat, needs FP32.”

Could you stop repeating shit back to me like 10 posts later and acting like you said it first? I never said FP16 was needed; I said it was very useful for post-processing and for fragment shaders. That’s why I repeated the fact that fragment shaders on the PS3 were all FP16. I’m pretty sure geometry culling is still done on the CPU.

“FP16 is useful for general-purpose mathematics; it was made for that (those big farms) and publicized for 3D graphics to appeal to turds (PS4 Pro). But 3D developers’ needs are at FP32.”

What’s with you and turds? And no, a 3D developer’s needs are not all FP32. Yes, FP32 precision is needed for geometry, but after the creation of the G-buffer, almost everything else is fragment shading. FP16 already has more precision than some fragment shaders need, so using FP32 doesn’t make them any better. Do yourself a favor: go download Intel Graphics Monitor, open a newer 3D game, capture a frame, and look at the G-buffer and the number of fragment shaders invoked.
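To make the precision point concrete, here’s a tiny numpy illustration of my own (not from any of the docs linked here): FP16’s 10-bit mantissa is plenty for a normalized color value headed to an 8-bit framebuffer, but it visibly mangles a world-space coordinate, which is why geometry math stays FP32.

import numpy as np

color = np.float32(0.73412)       # colors live in [0, 1]
print(np.float16(color))          # ~0.734 -- error invisible at 8 bits per channel

position = np.float32(1234.567)   # a vertex far from the origin
print(np.float16(position))       # 1235.0 -- nearly half a unit of error, visible jitter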

“You keep talking about ‘mistaken’ and blah blah blah, but you don’t know what you’re talking about. You are just reading some site and pulling words together, BUT the games are there and work flawlessly.”

Yea, because you have REAL knowledge. You don’t know what GFLOPS are, dude. Shut your mouth about me not knowing stuff. All I see is you going above and beyond with your praise of the Switch. H.264 is “perfect”. Games run “flawlessly”. Nvidia and their tools are the “best”. Yet h.265 is objectively more efficient, BotW has frame dips at 900p, and the only companies that used the X1 were Google and Nvidia, in two products, and a rumor from before we even knew the Switch used an Nvidia chip claimed that Nvidia begged Nintendo to be in their next system.

                                                    “Zelda and Mario Kart are there, and upscaled on TV.”

                                                    They’re on the Wii U, too.

1. I worked in the industry and know what I’m talking about. You go ask, since you don’t know. I’m fine with what I’ve learned.
I was Lead QA, for what it’s worth.

Yeah, sure… Qualcomm and Mediatek have the same know-how… but please. -_-

                                                      Speculation. Just that.

No one optimizes their drivers, tools and APIs like nVidia. They are the best on the software side. Anyone can do good; few can do superb.

Imagination Technologies isn’t known for any SoC expertise; it licenses its technology to third parties.
Also, SoCs have high development costs and aren’t born in a day. In fact, nVidia customized for Nintendo a technology already on the market. It’s a ‘customized’ Tegra, not a completely new design. The quality of the foundation is already proven.

nVidia is costly technology, and because of that it has high competition; also they aren’t very open on the software side. The Tegra X1 is inside the Pixel C and the Android Shield. The Tegra 4 was inside so many devices that it’s useless to list them.

‘Nintendo optimized software for Wii U’, making it not easy to port to hardware inferior to that.

How can you tell? Are you an insider?

That’s a matter of speculation too… for what I know, the Wii U is capable of 352 GFLOPS. At 176 it would be worse than a PS3, and comparisons showed that the PS3 was *clearly* behind the Wii U, which was even CPU-limited in many games.

You were the one talking about miracles of the Pascal architecture (FP16 improvements) vs. the Maxwell one in 3D gaming, certainly not me.
You were talking about ‘bad Nintendo, they didn’t give me Pascal FP16’.

H.264 is perfect for their needs, though I already said H.265 was better, so obviously if they’d like to integrate it, it’s ‘better’, as said. And no one knows what they have decided.

The X1 obeys market economics; if there are cheaper alternatives, they will buy them. And nVidia definitely doesn’t sell their technology cheap.
There are many tablets with the 430 chipset and lame batteries; a tablet manufacturer hardly goes top-end.

And the Wii U was 37-watt technology. The Switch is more powerful than that and consumes a fraction of it, likely 1/8. This is what ‘technology’ is about.

1. “I worked in the industry and know what I’m talking about. You go ask, since you don’t know. I’m fine with what I’ve learned.
I was Lead QA, for what it’s worth.”

Oh, the lead guy-who-tests-video-games. How ritzy. I think you were banking on me not knowing what that is. I had friends in quality assurance.

“Yeah, sure… Qualcomm and Mediatek have the same know-how… but please. -_-”

I never said Mediatek has the same know-how; they literally just license things.

                                                        “Speculation. Just that”

                                                        About what?

“No one optimizes their drivers, tools and APIs like nVidia. They are the best on the software side. Anyone can do good; few can do superb.”

How many posts did you spend saying we can’t compare the A9X and TX1 because there was far less overhead for the A9X? You do know that Nvidia develops their Android drivers, too, right? And can you name one example of a graphics API made by Nvidia before NVN? I think not.

“Imagination Technologies isn’t known for any SoC expertise; it licenses its technology to third parties.”

                                                        Again, they do make their own SoCs. They have an SoC design sub-division called IMGWorks which I literally talked about in the last post. You can’t be a company that makes licensable IP blocks for SoCs without knowing how to make an SoC.

“Also, SoCs have high development costs and aren’t born in a day.”

                                                        No shit. They can be made in under a year though.

“In fact, nVidia customized for Nintendo a technology already on the market. It’s a ‘customized’ Tegra, not a completely new design. The quality of the foundation is already proven.”

Right, so don’t act like what they did with the X1 and P1 isn’t representative of what’s in the Switch. Just because the SoC is being customized doesn’t mean they’re reinventing the wheel with the GPU. The changes are likely in the other blocks. It’s a customized Tegra, not a customized Maxwell.

“nVidia is costly technology, and because of that it has high competition; also they aren’t very open on the software side. The Tegra X1 is inside the Pixel C and the Android Shield. The Tegra 4 was inside so many devices that it’s useless to list them.”

But we’re only talking about the X1 and P1. The Tegra 4 was clearly a value to manufacturers back in the day. In fact, even the K1 was used in a decent number of devices, yet the X1 isn’t. Unless Nvidia’s SoCs surged in price, I don’t think you have an argument there.

“‘Nintendo optimized software for Wii U’, making it not easy to port to hardware inferior to that.”

                                                        Again, who said anything about running Wii U games on inferior hardware?

“How can you tell? Are you an insider?
That’s a matter of speculation too… for what I know, the Wii U is capable of 352 GFLOPS.
At 176 it would be worse than a PS3, and comparisons showed that the PS3 was *clearly* behind the Wii U, which was even CPU-limited in many games.”

Are you an insider? lol Okay, fine. We’ll say that if it makes you happy, insider. But just so you know, an X1/P1-based SoC would have to be clocked at 470MHz to match 240 32-bit GFLOPS. Looks like you may have to reevaluate what you think you know about FLOPS ratings unless you think the Switch has a GPU running at 1057MHz when docked. Stop comparing things using GFLOPS alone. No GPU is infinitely efficient with infinite bandwidth.
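Running that 470MHz figure backwards through the same peak-FLOPS formula (again assuming a 256-core Maxwell GPU doing one FMA per core per cycle):

CUDA_CORES = 256

def clock_for_gflops(target_gflops):
    # MHz needed to hit a target peak FP32 GFLOPS at 2 FLOPs/core/cycle
    return target_gflops * 1000.0 / (CUDA_CORES * 2)

print(clock_for_gflops(240))  # ~468.75 MHz, i.e. the ~470MHz quoted above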

“You were the one talking about miracles of the Pascal architecture (FP16 improvements) vs. the Maxwell one in 3D gaming, certainly not me.
You were talking about ‘bad Nintendo, they didn’t give me Pascal FP16’.”

Actually, I never mentioned Pascal lol The Tegra X1 already has FP16, genius. Pascal inherited it from the X1. It’s even in the documentation I linked you to. I’ll link you to it again, and I’ll include an AnandTech article saying the same thing just in case you need more pictures.

                                                        http://international.download.nvidia.com/pdf/tegra/Tegra-X1-whitepaper-v1.0.pdf

                                                        http://www.anandtech.com/show/8811/nvidia-tegra-x1-preview/2

Also, I don’t think using Pascal would have done anything for Nintendo because Pascal’s main benefits came from its higher clock speeds, which came from the newer process. As for the benefits of FP16 in games, just ask Epic; they stress exploiting FP16 whenever possible in that presentation PDF that I sent you, which you also didn’t read.

“H.264 is perfect for their needs, though I already said H.265 was better, so obviously if they’d like to integrate it, it’s ‘better’, as said. And no one knows what they have decided.”

Right, we don’t know what they decided, but you can’t call h.264 perfect if you know that h.265 is better. The TX1 already had h.265. Would it be dumb for them to get rid of it? Yes. That’s all I’m saying.

“The X1 obeys market economics; if there are cheaper alternatives, they will buy them. And nVidia definitely doesn’t sell their technology cheap. There are many tablets with the 430 chipset and lame batteries; a tablet manufacturer hardly goes top-end.”

I’m assuming this is you going back to why the X1 is only in the Pixel C and Shield TV. It’s more likely that the value just isn’t there. The Snapdragon 820-powered OnePlus 3T’s benchmarks are comparable to or better than the Pixel C’s despite having higher driver overhead, running the same OS, being a third the size, and having the added heat of a cellular modem.

“And the Wii U was 37-watt technology. The Switch is more powerful than that and consumes a fraction of it, likely 1/8. This is what ‘technology’ is about.”

                                                        What is this accomplishing? Did I ever say the Switch was weaker than the Wii U or less power efficient? No, I didn’t. I’m saying the Tegra line is less power efficient than modern PowerVR, Adreno, and Mali designs.

1. No, Mediatek does not ‘just license things’; they are an engineering company with know-how not comparable to nVidia’s.
Know-how is everything; that’s why Nokia was still Nokia and Wiko is still Wiko.

Android isn’t comparable to Apple’s OS in terms of performance, independently of drivers.

So show me the technology comparable to the X1 from IT; show me that SoC you are talking about.
Why haven’t I heard about it? Why is the market just Qualcomm, Mediatek and nVidia if this is a competitive SoC?

The X1 is a tablet-specific SoC; it isn’t favoured by manufacturers. In fact, the Pixel C is a tablet.
Tablet manufacturers build on smartphone SoCs, especially cheap ones (like Mediatek), or low-quality Snapdragons. They don’t need the high performance/cost of the X1. Still, the nVidia Shield is pretty powerful even with the Android OS on board. That means it’s good technology. Nintendo doesn’t carry that burden, nor do we know the exact specs of its custom SoC.

For what I know, Pascal raised FP16 computation; that’s the improvement over the previous architecture, plus the manufacturing process.

They aren’t getting rid of it; it was just a limitation of the development board. They aren’t officially getting rid of it. We don’t know, so speculating on this is useless.

The value isn’t there for tablet manufacturers. Is it there for Nintendo? If not, WHY would they have chosen it? In the end they got a very good deal. nVidia probably needed to make a deal this round, and we should all be happy about it, since it’s good technology, hardware- and software-wise.

If you compare a 16 nm SoC to a 20 nm SoC, you are going to say that. But it’s an unfair comparison. Compare apples with apples.

1. “No, Mediatek does not ‘just license things’; they are an engineering company with know-how not comparable to nVidia’s.
Know-how is everything; that’s why Nokia was still Nokia and Wiko is still Wiko.”

They’re a fabless semiconductor company like Qualcomm and Nvidia, but they don’t create their own IP, with the exception of the WiFi stuff they got from purchasing Ralink.

If they need a CPU or GPU, they license designs from ARM and Imagination. Qualcomm designs their own SoCs, makes their own ARM-based Kryo cores, and designs Adreno themselves. Nvidia either licenses reference designs from ARM or licenses just the ISA for their Denver cores, and obviously they design their GPUs on their own.

“Android isn’t comparable to Apple’s OS in terms of performance, independently of drivers.”

So since you made that claim, why don’t you prove it? What about Android makes its performance so terrible for gaming that it isn’t comparable to iOS?

“So show me the technology comparable to the X1 from IT; show me that SoC you are talking about.
Why haven’t I heard about it? Why is the market just Qualcomm, Mediatek and nVidia if this is a competitive SoC?”

Imagination doesn’t sell SoCs! Are you slow? They’re a semiconductor intellectual-property licensing company. They offer an SoC design service, but they themselves do not make SoCs to be sold to smartphone and tablet vendors. Mediatek uses designs from Imagination. Their WiFi stuff uses MIPS, and many of their SoCs use PowerVR. Their next big SoC uses a 4-cluster PowerVR 7XT+ clocked at 820MHz along with 10 CPU cores in 3 clusters, with reference designs by ARM.

The reason you don’t see Imagination SoCs is that they offer SoC design services to help people make SoCs, but they themselves don’t sell them. They do, however, make chips for demonstration purposes, like the GR6500, and for use in their wireless speaker products.

                                                            Are you really claiming that ARM and Imagination have no “know how” when they’re the ones that Mediatek, Samsung, Qualcomm, and Nvidia license things from?

                                                            For more proof, here’s an example of an ImgTec GPU being used.

“The X1 is a tablet-specific SoC; it isn’t favoured by manufacturers. In fact, the Pixel C is a tablet.”

                                                            You’re blowing my mind. /s

“Tablet manufacturers build on smartphone SoCs, especially cheap ones (like Mediatek), or low-quality Snapdragons. They don’t need the high performance/cost of the X1.”

Again, they were just fine using the K1. For some reason, even the Chinese console, the Fuze Tomahawk F1, uses a K1.

“Still, the nVidia Shield is pretty powerful even with the Android OS on board. That means it’s good technology. Nintendo doesn’t carry that burden, nor do we know the exact specs of its custom SoC.”

                                                            Again, why are you so convinced the specs that DF released are fake?

“For what I know, Pascal raised FP16 computation; that’s the improvement over the previous architecture, plus the manufacturing process.”

Yea, they officially brought FP16 into their Tesla chips, but those handle it the same way the TX1 does. Pascal literally inherited that from the TX1. Essentially, an FP32 core can process a Vec2 FP16 operation. For some reason, they completely killed FP16 performance in their consumer chips and limited it to 1/128th the performance of FP32.
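Since the packed-FP16 path just pairs two half-precision operands per FP32 core, the peak rate simply doubles. A quick sketch with the same assumed core count and docked clock as before:

CUDA_CORES = 256

def peak_gflops(clock_mhz, fp16=False):
    per_core = 2 * (2 if fp16 else 1)  # FMA = 2 FLOPs, x2 lanes when packed FP16
    return CUDA_CORES * per_core * clock_mhz / 1000.0

print(peak_gflops(768.0))             # ~393 GFLOPS FP32
print(peak_gflops(768.0, fp16=True))  # ~786 GFLOPS packed FP16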

“They aren’t getting rid of it; it was just a limitation of the development board. They aren’t officially getting rid of it. We don’t know, so speculating on this is useless.”

How do you know they’re not getting rid of it? You’re positive they didn’t remove it? They just released the download size of Mario Kart 8 Deluxe as well, and it’s the same size as the Wii U version plus DLC (7GB). ASTC doesn’t have a 4bpp mode, so, at the very least, that game should see an 11% decrease in its size (6.2GB) if it used ASTC.
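The 11% comes from ASTC’s fixed 128-bit block size; the nearest block footprint below a classic 4bpp format is 6x6. A rough sketch, under the (generous) assumption that the whole download is 4bpp texture data:

def astc_bpp(block_w, block_h):
    return 128.0 / (block_w * block_h)  # ASTC blocks are always 128 bits

print(astc_bpp(6, 6))               # ~3.56 bpp
print(1 - astc_bpp(6, 6) / 4.0)     # ~0.111 -> the ~11% saving
print(7.0 * astc_bpp(6, 6) / 4.0)   # ~6.2 GB if all 7GB were 4bpp textures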

“The value isn’t there for tablet manufacturers. Is it there for Nintendo? If not, WHY would they have chosen it?”

                                                            Like I said, the story going around is that Nvidia begged.

“In the end they got a very good deal. nVidia probably needed to make a deal this round,”

                                                            Which you seem to agree was the case.

“and we should all be happy about it, since it’s good technology, hardware- and software-wise.”

Unless they picked AMD or Vivante, we would have had good technology regardless. Nvidia isn’t the king of mobile GPU hardware; that’s why they dropped out of the mobile phone SoC market a few years ago and are moving away from 15W TDPs with the TP1’s successor.

“If you compare a 16 nm SoC to a 20 nm SoC, you are going to say that. But it’s an unfair comparison. Compare apples with apples.”

You thought it was completely fine comparing the performance of a chip with a giant heatsink on it to one that was passively cooled (TX1 vs A8X). Obviously, that has nothing to do with power usage, and Nvidia’s tests showed that, with performance matched, the TX1 used 44% less power on average, but there are several problems with that test if you’re judging PowerVR vs GeForce based on it.

1. We’re taking Nvidia’s word that they were separating out GPU power usage from the rest of the A8X.

2. When a battery-powered TX1 device eventually came out, it used a battery 24% larger than the iPad Air 2’s but got only 17% more battery life in one test and about 6% less in other tests, while only outperforming the iPad Air 2 by 10-15% in both Manhattan and T-Rex. However, the A8X still can’t claim to be nearly as performant per watt, because the Pixel C still outperforms the iPad Air 2 by 90-92% in Manhattan 3.1 and 3.1.1.

3. Nvidia compared their newest mobile GPU design, which wouldn’t be released in any product until 5 months later, to a chip released 4 months prior, based on a GPU design that was released 12 months prior. Three months before Nvidia did their test, PowerVR had already released the 7XT series, which claimed performance increases of up to 60% clock-for-clock, core-for-core, with improved power usage; it just wasn’t in any products yet.

When the iPhone 6s came out just 4 months after the Shield TV, it was the first 7XT-based device released and, on paper, should have been 36.7% less powerful than the A8X. Instead, its performance ranged from just 2% lower to 66% higher. Sure, it was 16nm, but it didn’t need that to get a significant increase in performance per watt, and it actually outperformed Imagination’s own claims. The iPhone 6s also only got 25% less battery life than the A8X with a battery almost a third the size. Had Nvidia compared the TX1 to a 7XT GPU configured the same as the A8X (ALUs/clock/nm), I doubt what they would have found would be worth showing off.

4. The TP1 is 16nm and uses Pascal, which is just a die shrink of Maxwell. The increased efficiency of the 16nm process allows it to clock 50% higher while using the same amount of power. You could assume that Tegra’s own efficiency gains from moving to 16nm would allow it to again overtake PowerVR’s design, except that, since the release of the iPhone 6s, Imagination has managed to further cut power usage per clock on TSMC’s 16nm process without large changes to the microarchitecture, which resulted in another 50% performance improvement.

1. Everyone licenses technology: Qualcomm licensed its CPU from ARM and its GPU from AMD; nVidia licenses its CPU from ARM and cross-licenses GPU technology with AMD and many others, etc.
All these companies are engineering companies that build their SoCs with their know-how.
nVidia is the best since it has expertise in every market. Consoles too.

Well, just because you are unaware of it doesn’t mean it’s wrong. I’m not going to mount an expedition and lose time just to show this. Everyone knows that Android has bad performance on many levels and that iOS is more integrated and optimized with its specific hardware. Developers confirm this every time. But if you think otherwise, that they are comparable, then I’ll let you stay with your certainty. I’m not losing anything with this.
I can take time searching for debated things, not for granted ones.

Mediatek uses their designs? Then they must be really weak, knowing what Mediatek offers.
I would take nVidia expertise over IT any day; evidently Nintendo concurs on this.
I will claim ->any day<- a ->Breath of the Wild<- that lasts 3 hours on a small battery.
It’s not a smartphone, so they don’t need the best smartphone technology. It’s a handheld that must be great. And it is. Proven by the software that actually runs on it. And I repeat, no one knows what’s inside.

The A8X IS 20 nm. I didn’t compare 16 with 20 or 16 with 28. I compared 20 with 20. That’s why nVidia made a direct comparison when they presented the X1 (which is not inside the Nintendo Switch, just inside the developer board).
nVidia compared it directly and noted better performance; they did not compare it like you do. It’s obvious that much greater performance corresponds to more power consumption and the need for a bigger battery. Even if it’s more performant, they can’t do miracles.
I’m not going to question their data; Apple didn’t question it.

                                                              I hope Nintendo Switch isn't 16 nm, so they can have an easy time implementing a future Nintendo Switch Pocket with that 16 nm or less manufacturing process to make it even smaller for those that need a smaller console. If it's 20 nm and can run Breath of the Wild for 3 hours for me is enough. If its 16 nm than maybe it's less impressive as an achievement.

1. “Everyone licenses technology: Qualcomm licensed its CPU from ARM and its GPU from AMD; nVidia licenses its CPU from ARM and cross-licenses GPU technology with AMD and many others, etc.”

Qualcomm doesn’t license their GPUs from AMD. ATI created the Imageon for mobile phones, and when AMD bought them, they sold the Imageon handheld division to Qualcomm in 2008. Qualcomm then turned that into Adreno, but the Adrenos of today are completely different from the Imageons. The Imageons Qualcomm bought had fixed-function pipelines; by the Adreno 200 series they used a unified shader model with VLIW5, then they moved to a scalar instruction set in the 300 series, etc.

“nVidia is the best since it has expertise in every market. Consoles too.”

                                                                nVidia is known for:
                                                                Desktop GPUs
                                                                Mobile GPUs (small market)

                                                                Found in:
                                                                Self-driving cars (SoC)
                                                                Some phones and tablets (SoC)
                                                                Xbox (GPU)
                                                                PS3 (GPU)
                                                                Switch (SoC)

                                                                AMD is known for:
                                                                Desktop CPUs
                                                                Desktop GPUs

                                                                Found in:
                                                                X360 (GPU)
                                                                XBO (CPU + GPU)
                                                                PS4 (CPU + GPU)
                                                                Gamecube (GPU)
                                                                Wii (GPU)
                                                                Wii U (GPU)
                                                                FM Towns Marty (CPU) :-P

                                                                ARM is known for:
                                                                ARM Instruction Set
                                                                ARM Low-power/Mobile CPUs
Mobile GPUs

                                                                Found in:
                                                                Almost every smart phone (CPU)
                                                                Many smart phones (GPU)
                                                                Routers (CPU)
WiFi/Bluetooth controllers (CPU)
                                                                GBA (CPU)
                                                                DS (CPU + GPU)
                                                                3DS (CPU)
                                                                Wii (Security processor)
                                                                Wii U (Security processor)
                                                                PS4 (CPU in Low-power SoC)
                                                                PSVR (CPU in low-power SOC in breakout box)
                                                                Vita (CPU)
                                                                Switch (CPU)

                                                                Imagination’s known for:
                                                                Mobile GPUs (largest by volume just a few years ago)
                                                                MIPS Instruction Set
                                                                MIPS Low-power/Mobile CPUs

                                                                Found in:
                                                                Routers (CPU)
WiFi/Bluetooth controllers (CPU)
                                                                PlayStation (CPU)
                                                                Nintendo 64 (MIPS for CPU + GPU)
                                                                Dreamcast (PowerVR)
                                                                PlayStation 2 (MIPS for CPU + IO)
                                                                PSP (MIPS for CPU + GPU)
                                                                PlayStation Vita (PowerVR)
                                                                Renesas’s self-driving car platform (PowerVR)

If you’re gonna brag about a company’s prowess in video game consoles, don’t choose the company whose chips/designs were used in fewer game consoles than anybody else besides Qualcomm.

“Well, just because you are unaware of it doesn’t mean it’s wrong. I’m not going to mount an expedition and lose time just to show this. Everyone knows that Android has bad performance on many levels and that iOS is more integrated and optimized with its specific hardware. Developers confirm this every time. But if you think otherwise, that they are comparable, then I’ll let you stay with your certainty. I’m not losing anything with this.
I can take time searching for debated things, not for granted ones.”

So you can’t prove it then? You’re just gonna make claims and I’m just supposed to believe you? Bullshit. If you’re making claims, then the burden of proof is on you.

“Mediatek uses their designs? Then they must be really weak, knowing what Mediatek offers.”

Oh boy. Mediatek uses Mali and PowerVR for different SoCs and is always CPU-focused. For example, Apple uses a 192-core 7XT+ at 670MHz for their latest 16nm SoC, while Mediatek is using a 128-core 7XT+ at 820MHz on their next 10nm SoC. That should still be a pretty good mobile GPU, but they could have easily gone with 256 cores at about 650MHz on a 10nm process.
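A crude ALUs-times-clock comparison of those two configurations (ballpark only; it ignores bandwidth, drivers, and process node):

apple_cfg    = 192 * 670  # 192-core 7XT+ @ 670MHz
mediatek_cfg = 128 * 820  # 128-core 7XT+ @ 820MHz
print(apple_cfg / mediatek_cfg)  # ~1.23 -> ~23% more raw shader throughput for Apple's setup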

“I would take nVidia expertise over IT any day; evidently Nintendo concurs on this.”

                                                                Well I hope nobody ever hires you to design a console. You choose based on brand names you know and love.

“I will claim ->any day<- a ->Breath of the Wild<- that lasts 3 hours on a small battery.”

BotW is a gorgeous game, but that's because of its artistic prowess, not technical prowess.

                                                                http://www.technobuffalo.com/wp-content/uploads/2016/06/Legend-of-Zelda-Breath-of-the-Wild-Screenshots-01-1280×720.jpg

There are at least a few modern mobile GPUs that could handle BotW using less power. You've already seen better graphics at a higher resolution and frame rate on a Mali GPU, yet you still refuse to acknowledge that.

                                                                "It's not a smartphone so they don't need best smartphone's technology. It's a handheld that must be great. And it is. Proved by the software that actually runs on it. And I repeat, no one knows what's inside."

So developers and Foxconn have no idea what's inside it? Yea right, dude. And I'm not claiming it's a smartphone. I don't think it needs two high-resolution cameras or a fingerprint scanner, and it certainly doesn't need a cellular SoC, but it is a mobile device that's primarily supposed to be a home console, so it had better have the best performance per watt you can get.

                                                                "A8X IS 20 nm. I didn't compared 16 with 20 or 16 with 28. I compared 20 with 20. That's why nVidia made a direct comparison when they presented the X1 (that is not inside the Nintendo Switch, just inside the developer board)."

                                                                Nvidia made the comparison to the A8X because they viewed it as having the most powerful GPU at the time, not because they were both 20nm.

                                                                "nVidia did compare it directly and noted better performance, they did not compare it like you."

                                                                They used Manhattan and T-Rex from GFXBench to judge its performance just like I did! The Nvidia representative even said that they were using power usage during the Manhattan benchmark to compare them.

                                                                "It's obvious that a much greater performance correspond to more power consumption and the need of a bigger battery. Even if it's more performant they eventually can't do miracles."

Except the whole point of those tests was to show that the TX1 has much higher performance per watt. They literally said it scored twice as high normally and used 44% less power when underclocked to the same performance, yet without the benefit of a heatsink, the Pixel C's performance advantage over the Air 2 is only about 15%, and it doesn't have significantly better battery life, especially when you consider that the TX1 has the advantage of a 64-bit LPDDR4 interface instead of a 128-bit LPDDR3 interface. Just in case you didn't know, tile-based GPUs became popular in phones because they're really good at lowering traffic to external memory, which lowers power consumption because the memory controller and memory are used far less often. IMRs do almost everything in external memory. So even if Nvidia was able to isolate the power usage of JUST the GPU from an SoC that they didn't make, they would be hiding a decently large power draw of their own SoC: the memory controller. Like I said, I can't deny that the TX1 is more efficient than the A8X, but we are taking Nvidia's word for a lot.
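The tile-based point is easiest to see with a toy framebuffer-traffic estimate (my numbers, purely illustrative; real GPUs compress and cache aggressively):

W, H, BPP, FPS, OVERDRAW = 1280, 720, 4, 60, 3

# Immediate-mode: every shaded fragment is read-modify-written in DRAM.
imr_gb_per_sec = W * H * BPP * OVERDRAW * 2 * FPS / 1e9  # ~1.3 GB/s
# Tile-based: shading stays in on-chip tile memory; DRAM sees one write per pixel.
tbr_gb_per_sec = W * H * BPP * FPS / 1e9                 # ~0.2 GB/s

print(imr_gb_per_sec, tbr_gb_per_sec)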

                                                                "I'm not going to question their data, Apple didn't (questioned) that."

                                                                Why would they? The TX1 wasn't in anything yet and Apple released the iPad Pro with A9X before the TX1 made it into anything battery powered.

                                                                "I hope Nintendo Switch isn't 16 nm, so they can have an easy time implementing a future Nintendo Switch Pocket with that 16 nm or less manufacturing process to make it even smaller for those that need a smaller console."

That would be another reason why Nvidia was a bad choice, and that's why it's relevant that Tegras aren't in smartphones anymore. Smaller handhelds would be roughly the same size as a phone. If Nintendo wants to release a smaller version of the Switch as a dedicated handheld, they would really have to rely on a die shrink. This is also why Nintendo should have designed a proper handheld first and then scaled up.

                                                                "If it's 20 nm and can run Breath of the Wild for 3 hours for me is enough. If its 16 nm than maybe it's less impressive as an achievement."

I'm assuming it's 20nm, too, because a 16nm chip running BotW for only 3 hours would surprise even me. At the same time, I don't understand why anybody would go with 20nm at this point. TSMC had issues with their 20nm process but did much better with 16nm, which is probably why Nvidia skipped over 20nm with their desktop GPUs. There are also already lines of SoCs that have been on 16nm for two generations.

1. Exactly where I wanted to go: Qualcomm bought from AMD. It’s not their own technology; it’s borrowed technology, and they upgraded from that to what has now become a completely different technology (after 10 years, technology changes…). They should still be paying royalties for AMD technology (and probably nVidia’s too, because the two are the leaders in 3D graphics and share many patents), and they probably retained many AMD engineers.

You aren’t being fair to nVidia:
nVidia is in discrete graphics, embedded graphics, Xbox, PS3, Nintendo Switch, develops complete SoCs (both for smartphones and tablets), bought 3DFX (and its patents), worked with SEGA, Microsoft and Sony. It has plenty of know-how, second to none, except AMD, which has x86 expertise too, though it has zero smartphone/tablet experience while nVidia has plenty of it.
Already in the Xbox, PS3, many tablets and some smartphones, it was a safe bet. They also had the technology ready; Nintendo just wanted a customized version of it.

The Nintendo 64 GPU was developed by Silicon Graphics. They just licensed the CPU, and it wasn’t a MIPS but a NEC (a customized MIPS, like Qualcomm’s CPUs are customized ARMs).
IT just bought MIPS technology when it was already niche; in fact, IT is a small company (still), a tiny fraction of nVidia. Nintendo isn’t interested in MIPS technology.
IT has nothing technologically comparable to Tegra; they would have had to do it from scratch, and designing a whole chip isn’t like customizing an already proven technology.
Imagination Technologies hasn’t been involved in any recent console. Also, it was always licensed by NEC, Sony, etc. They never directly engineered anything.

Don’t believe it, stay your way. I don’t lose anything; I already know it.
OK, this is just a 9-month-old quote that’s pretty telling: “I think the iPhone’s supremacy is a legacy of how mobile has evolved, and doesn’t necessarily dictate the future. The Android platform is moving forward (e.g. with the Vulkan API, which is the Android equivalent to Metal), and developers and middleware providers are taking Android much more seriously as a source of commercial success, so there is every hope that Android will catch up.”
You can find dozens of quotes like this criticizing the Android subsystem, development tools, etc.
Android developers develop with limits in place; they don’t exploit the software/hardware like they do on iOS.

Mediatek does engineer their SoCs, then; IT doesn’t just hand over their designs for Mediatek to rebrand. So you are saying what I was saying. IT is bubbles.

And so they found better performance than the A8X even using unoptimized third-party software: both Android and GFXCrap. As I said, you can easily compare on the same OS. They did better with a lower-performing OS and you’re still there criticizing: just meh.

You are talking crap. You still push tile-based technology… it’s OLD technology, and nVidia uses it both in discrete and mobile platforms. And they aren’t even using IT’s patents but Gigapixel’s (from the 1990s, bought by… you know: nVidia). Why do you insist on pointing to old technology?

The A9X uses 16 nm technology. It isn’t comparable. And when nVidia builds their 16 nm chips (maybe) for the Nintendo Switch Pocket, they will probably be even more efficient than the A9X, since nVidia is the top engineer. ^^
Though this isn’t about numbers but about the contract and the whole environment. nVidia gave Nintendo a whole environment. The best one, because ->they are nVidia<-.

There can be many reasons for 20 nm:
                                                                  – Cheap?
                                                                  – Cheaply available in high volumes?
                                                                  – Already tested and without any design risks?

Apple sells you their technology at 3x its real value; they can afford more expensive production.

                                                                  nVidia didn't just 'skipped'. For once it anticipated Apple's design by 9 months, and they aren't few in this world. So Nintendo already developed their console and games on latest technology. You need 4 years to make a console, you know. 2 years ago they did the X1, then customized it for the Nintendo Switch. Knowing Nintendo it should have been 4 years old technology, not 2 or even less. Production of the customized SoC started very late, mid 2016. It was already a risk for Nintendo to come so late with a system launching in million just 9 months after.
                                                                  This isn't a smartphone that rehash every year, this is a COMPLETELY NEW SYSTEM with not just drivers but an entire new ecosystem.

1. “Exactly where I wanted to go: Qualcomm bought from AMD. It’s not their own technology; it’s borrowed technology, and they upgraded from that to what has now become a completely different technology (after 10 years, technology changes…).”

By that same argument, any knowledge Nvidia gained from 3DFX doesn’t matter, because they didn’t develop any of that themselves.

“They should still be paying royalties for AMD technology (and probably nVidia’s too, because the two are the leaders in 3D graphics and share many patents), and they probably retained many AMD engineers.”

Are you really suggesting that Qualcomm should pay AMD and Nvidia royalties just because you like them so much? You know who they do/did pay licensing fees to? Imagination. They licensed their display IP for at least a short time.

“You aren’t being fair to nVidia:
nVidia is in discrete graphics”

                                                                    That’s what desktop GPUs would be.

                                                                    “Embedded graphics”

                                                                    Right, SoCs.

                                                                    “Xbox”

                                                                    Said that.

                                                                    “PS3”

                                                                    Said that, too.

                                                                    “Nintendo Switch”

                                                                    Yup, said that.

                                                                    “develops complete SoCs (both for smartphones and tablets)”

Said that. Should have added that all their expertise didn’t seem to help them make anything wildly popular in those markets, and they have since stopped trying.

                                                                    “bought 3DFX (and its patents)”

And ImgTec bought MIPS and Caustic and Ensigma.

                                                                    “worked with SEGA”

                                                                    And Imagination’s GPU was in the Dreamcast design that they eventually went with.

                                                                    “Microsoft”

Same thing as saying “Xbox”, because any time they worked together after that, it was because Microsoft bought one of their premade chips for something.

                                                                    “Sony”

                                                                    Same as saying PS3.

“It has plenty of know-how, second to none, except AMD, which has x86 expertise too, though it has zero smartphone/tablet experience while nVidia has plenty of it.”

But I thought Qualcomm owed every bit of their mobile GPU prowess to AMD? Plus, Nvidia and Qualcomm make their own custom ARM cores, the Denver and Kryo cores respectively. They just license the instruction set for those and use ARM’s reference designs for the little cores.

“Already in the Xbox, PS3, many tablets and some smartphones, it was a safe bet.”

You just repeated things again to make the list look longer. You can’t count the PS3 and Xbox three times each.

“They also had the technology ready; Nintendo just wanted a customized version of it.”

Qualcomm, Imagination, and ARM already had the technology for it, too. Nvidia begged them because they were embarrassed they weren’t in the PS4 and XBO.

                                                                    Honestly, how dare you claim that the most common names in mobile computing have no expertise in that market while Nvidia has all the expertise in the world. I’m not trying to say that Nvidia sucks or that they’re inexperienced, they just don’t have the best mobile SoCs.

“The Nintendo 64 GPU was developed by Silicon Graphics. They just licensed the CPU, and it wasn’t a MIPS but a NEC (a customized MIPS, like Qualcomm’s CPUs are customized ARMs).”

MIPS Technologies designed it. They were a subdivision of SGI at the time. NEC just fabricated it. It seems you do know that Qualcomm does custom CPUs, but I don’t believe licenses to modify ARM’s reference designs were available until recently. The Kryos just license the ISA; the rest is custom.

                                                                    “IT just bought MIPS technology when it was already niche”

                                                                    Yet when Nvidia bought the dying 3DFX, they were just adding to their “expertise”.

“in fact, IT is a small company (still), a tiny fraction of nVidia.”

Yet you think they should start fabbing GPUs for vendors to buy for desktop cards.

                                                                    “Nintendo isn’t interested in MIPS technology.”

                                                                    Never said they were.

                                                                    “IT has nothing technologically comparable to Tegra, they should have done it from scratch and designing a whole chip isn’t like customizing an already proven technology. Imagination Technology hasn’t been involved in any recent console. Also it was always licensed by NEC, Sony, etc. They didn’t directly engineered nothing.”

                                                                    Do you know what engineering is? Do you have any idea what goes into making designs for licensing IPs? In what world would a company that designs GPUs, CPUs, DSPs, encoders, decoders, and RF controllers not be FILLED with engineers? What Nvidia’s engineers do is literally the same as what ImgTec’s engineers do. They design a chip, send the design over to TSMC who actually puts the chip together, and TSMC sends them back the chips. The difference is that Nvidia makes them in bulk and sells them to people while ImgTec makes them in smaller amounts for driver development.

                                                                    You’re acting like their differences in business model should account for their expertise.

                                                                    “Don’t believe it, stay your way. I do not lose anything, I already know it.
                                                                    Ok, this is just a 9 months old quote pretty explicative: “I think the iPhone’s supremacy is a legacy of how mobile has evolved, and doesn’t necessarily dictate the future. The Android platform is moving forward (e.g. with the Vulkan API, which is the Android equivalent to Metal), and developers and middleware providers are taking Android much more seriously as a source of commercial success, so there is every hope that Android will catch up.”
                                                                    You can find dozens of this quotes criticizing Android subsystem/development tools, etc.
                                                                    Android developers develops with limits in place, they don’t exploit software/hardware like with iOS.”

                                                                    Thank you. I went ahead and read the rest of the article. A lot of it I expected (fragmentation because of the sheer amount of devices), but they’re talking about something very different from benchmarks. Yes, they use the word benchmark in the article, but they’re talking about game performance, games that are intended to run at playable speeds. Benchmarks are made to not be optimized. They’re made to stress things, and most are primarily made to benchmark Android devices. Issues other developers might care about, like supporting older devices and different resolutions, don’t matter to them. They make something that runs with general optimizations and they put it out there for people to run. That’s probably why the Metal version of their benchmark gets lower scores than the OGL version on new devices.

                                                                    “Mediatek do engineer their SoCs then, IT don’t just give their design out and Mediatek rebrands. So you are saying what I was saying. IT is bubbles”

                                                                    I’m not saying Mediatek aren’t engineers. I said they don’t make the GPU and CPU designs they use from scratch. Those are licensed, but there are still many other parts of the SoC that they’re completely responsible for.

                                                                    “And so they found better performance than A8X even using third parties unoptimized software: both Android and GFXCrap. As I said you can compare easily on the same OS. They did better with a lower performant OS and you still there critizing: just meh.”

                                                                    Hey, genius!! I DID NOT SAY THAT THE A8X IS MORE EFFICIENT THAN THE TX1. Why do I have to say that like ten fucking times? You are taking Nvidia’s word that they were able to separate out GPU power usage from everything else on an SoC, and you’re assuming their own numbers don’t mask higher memory controller power usage. I’m not disputing that the TX1 is the more powerful or more efficient chip here; I’m suggesting that, without knowing more details about the test, you can’t assume their testing methodology was completely sound. You’re also looking at a chip’s performance without factoring in cooling. I pointed out that the performance gap lessens when the TX1 is in a similar thermal environment. That means that even though it can’t reach the max performance it claims and is underclocking, it is still saving power as a result of the underclock.

                                                                    And again, GFXBench isn’t supposed to be optimized for Nvidia or Imagination; that would make the test itself unreliable. Let’s not forget that GFXBench receives constant updates so that the results stay comparable.

                                                                    “You are talking crap. You still push on tile-based technology… it’s OLD technology, and nVidia uses it both in discrete and in mobile platforms. And they aren’t using even IT patents but Gigapixel’s ones (1990 old, bought by… you know: nVidia). Why do you insist on pointing out old technology?”

                                                                    I’m bringing up tile-based rendering because it’s an example of an architecture with on-chip memory, which scales with clock speed and reduces the requirements on external memory. You’re making me repeat myself again. Anyway, I’m glad you finally decided to look up tile-based rendering, but the claim that Nvidia uses tile-based rendering in their new chips is based on speculation from a test done by one guy, David Kanter, that a bunch of sites started reporting. But even in the comments on his original article, people said they believe it’s just clever scheduling. Even David says it is not tile-based rendering but tile-based rasterizing, which doesn’t provide the same bandwidth advantages as a tile-based renderer. Other commenters also found AMD GCN GPUs producing similar results in the test program David made.

                                                                    The difference is that tile-based rasterization is mainly there to make better use of caches, for the sake of delta color compression, and to take advantage of spatial locality, but it’s still an IMR. A tile-based renderer, in the case of PVR, first does vertex shading and caches primitives into a tiled parameter list which is written to main memory. Once done, each tile is pulled into a USC and each USC tries to render it from beginning to end before writing only the final result to memory. Of course, it’s not perfect and occasionally things need to be written to RAM prematurely and then read back in. You can think of it as being like a bunch of smaller GPUs, each with a little bit of on-chip memory, only instead of rendering the whole frame, each is only responsible for a small 32 pixel by 32 pixel tile. The other thing that PVR does, which makes it a tile-based DEFERRED renderer, is that it defers all fragment shading until after the z-buffer is created, which eliminates all overdraw with opaque geometry and prevents unnecessary invocations of fragment shaders as well as unnecessary texture fetches. It’s similar to an early depth pass but quicker and without redrawing geometry. (See the sketch below.)
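
                                                                    To make that flow concrete, here’s a toy CPU-side sketch in Python. To be clear, this is just my own illustration of the bin / depth-resolve / deferred-shade pipeline I described, not PVR’s actual hardware or API; the 32x32 tile size is from above, but the primitive format and the helper names are made up:

                                                                    TILE = 32  # pixels per tile side

                                                                    def bin_primitives(prims):
                                                                        # Pass 1: after vertex work, bin each primitive into every tile it
                                                                        # touches -- this stands in for the tiled parameter list that gets
                                                                        # written out to main memory.
                                                                        tiles = {}
                                                                        for prim in prims:
                                                                            x0, y0, x1, y1, depth, shade = prim
                                                                            for ty in range(y0 // TILE, y1 // TILE + 1):
                                                                                for tx in range(x0 // TILE, x1 // TILE + 1):
                                                                                    tiles.setdefault((tx, ty), []).append(prim)
                                                                        return tiles

                                                                    def render_tile(tx, ty, prims_in_tile):
                                                                        # Pass 2: everything here happens per-tile, standing in for on-chip
                                                                        # memory. Resolve depth first, deferring the shade() call itself.
                                                                        zbuf = {}  # (x, y) -> (depth, shade-callable)
                                                                        for x0, y0, x1, y1, depth, shade in prims_in_tile:
                                                                            for y in range(max(y0, ty * TILE), min(y1, (ty + 1) * TILE - 1) + 1):
                                                                                for x in range(max(x0, tx * TILE), min(x1, (tx + 1) * TILE - 1) + 1):
                                                                                    if (x, y) not in zbuf or depth < zbuf[(x, y)][0]:
                                                                                        zbuf[(x, y)] = (depth, shade)
                                                                        # Only the surviving fragment per pixel gets shaded (zero opaque
                                                                        # overdraw), and the finished tile is written out exactly once.
                                                                        return {pixel: shade() for pixel, (_, shade) in zbuf.items()}

                                                                    # Two overlapping opaque quads; red (depth 0.2) occludes blue (depth 0.5),
                                                                    # so blue's shader never runs for the covered pixels -- the deferred part.
                                                                    prims = [(0, 0, 40, 40, 0.5, lambda: "blue"),
                                                                             (10, 10, 30, 30, 0.2, lambda: "red")]
                                                                    framebuffer = {}
                                                                    for (tx, ty), plist in bin_primitives(prims).items():
                                                                        framebuffer.update(render_tile(tx, ty, plist))

                                                                    The point to notice is that shade() runs at most once per pixel no matter how many opaque primitives overlap it, and each tile’s z and color live in a small local structure standing in for on-chip memory; main memory is only touched to read the parameter list and to write each finished tile once.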

                                                                    “A9X use 16 nm technology. Isn’t comparable. And when nVidia will build their 16 nm chips (maybe) for the Nintendo Switch Pocket they will be probably even more efficient than A9X since nVidia is the top engineer. ^^
                                                                    Though this isn’t about numbers but about contract and whole environment. nVidia gave to Nintendo a whole environment. The best one because ->they are nVidia<-."

                                                                    Except ImgTec already made huge performance-per-watt increases on 16nm since the A9X. The A9 and A9X were only the first 16nm chips that used PowerVR; the A10 was the second and got an additional 50% clock increase while using less power, and the A10X is expected to release in the next iPad Pro in March.

                                                                    "There can be many reason for 20 nm:
                                                                    – Cheap?"

                                                                    I can understand that but I heard such bad things about the process that I imagine TSMC would rather give it an early grave.

                                                                    "– Cheaply available in high volumes?
                                                                    – Already tested and without any design risks?"

                                                                    I think 16nm has already matured as a process considering that 10nm chips are already starting to be made.

                                                                    "Apple sells their technology at you for 3x real value, they can afford more expensive production."

                                                                    But stores and console makers can also afford to sell things with lower profits because they make money off of game sales.

                                                                    "nVidia didn't just 'skipped'. For once it anticipated Apple's design by 9 months, and they aren't few in this world. So Nintendo already developed their console and games on latest technology. You need 4 years to make a console, you know. 2 years ago they did the X1, then customized it for the Nintendo Switch. Knowing Nintendo it should have been 4 years old technology, not 2 or even less. Production of the customized SoC started very late, mid 2016. It was already a risk for Nintendo to come so late with a system launching in million just 9 months after."

                                                                    A console’s development doesn’t start with the design of the SoC, it starts with a plan and concept. Both the Xbox One and PlayStation 4 came out in November of 2013, yet their APUs feature CPU cores that weren’t in products until just a few months before, and the GPU architectures they used were from products that started shipping in January of 2012. There’s nothing that would have prevented Nvidia from somewhat concurrently developing the X1 and the SoC used in the Switch. Likewise, a custom SoC using Cortex A57 cores and a PowerVR 7XT GPU could have been started as early as November of 2014, implemented the changes in 7XT Plus along the way to help Gamecube and Wii emulation, and it would have inherited the manufacturing optimizations that Imagination worked on with TSMC. Chips for development kits could have even been available by the end of 2015.

                                                                    "This isn't a smartphone that rehash every year, this is a COMPLETELY NEW SYSTEM with not just drivers but an entire new ecosystem."

                                                                    Have you seen die shots from phone SoCs generation to generation, or even the TK1 vs the TX1? They’re completely different. Sure, the different blocks are probably just incremental changes year to year, but they need to be laid out completely differently.

                                                                  1. nVidia still enjoys 3Dfx patents; I presume Qualcomm still enjoys AMD’s if they bought them too. And there is cross-licensing on every product.

                                                                    Everyone is paying licenses for 3D graphics to nVidia and AMD since they have the patents. Even IT should be paying them. IT will have their own patents and companies will pay for them, but obviously they don’t have the same weight in the 3D department as the other two giants.

                                                                    MIPS is niche and its value is null (except patents… but since they got it for a few million, those can’t be worth much), and Caustic… never heard of it.
                                                                    If there were value there, the other giants would have bought it instead of a company as small as IT.

                                                                    Sure, it was inside the Dreamcast… as licensed by a TRUE manufacturer like NEC. Licensed.

                                                                    Certainly Qualcomm pays royalties to AMD for its 3D technology, and not only to AMD eventually. If you want to be competitive in today’s market you have to pay royalties to many companies, nVidia included.

                                                                      Denver isn’t inside the X1, nor inside that ‘custom Tegra’. Anyway it isn’t such good technology. A57 is more powerful.

                                                                    nVidia didn’t beg for anything; nVidia provided good technology for the right price, I presume. nVidia provided a complete ecosystem with warranties… and no, no one could match it. But evidently developers talking wonderfully about it isn’t enough to pacify myownfriend. Or to make him happy.

                                                                    No one else has expertise in the console segment nor the tools, that’s a fact. Sure, they have lots of expertise in designing smartphone chips, no one denied it.
                                                                    And no one is saying it’s the best ‘mobile’ chip in the world. Obviously for smartphones there are much better options. It’s another market with other needs.

                                                                    Nope, it was a CPU customized by NEC (cost-reduced). Obviously it retained all the technology of the MIPS one, as most do with ARM’s reference designs (nVidia included with the A57, I suppose).

                                                                    But nVidia actually bought patents for their 3D technology; MIPS is completely irrelevant to IT, except for some custom designs that likely no one would consider. nVidia still owns 3Dfx patents that they use in their products (they were foundational 3D graphics technology; like tile-based rendering in the 1990s, a lot of things from the 1990s still form the base of their technology).

                                                                    Nope, IT just designs; nVidia does everything. You will see nVidia, NEC, or AMD in the chip serigraphy. nVidia controls every phase of the design and production; it just uses third-party foundries like everyone does, even Samsung and Apple. You’re just underestimating the importance of a manufacturer like nVidia. They provide Nintendo the actual chip.

                                                                    Nope, benchmarks are developed like any software. If the OS is sh1t they are going to stress the sh1t, not just a piece of hardware. They don’t run in assembler; they talk with the software and the layers that cover that hardware.

                                                                    That’s obvious, as no one designs from scratch, especially weak companies like Mediatek (big but with weak know-how). The less expertise you have, the less you can customize, and nVidia has plenty.

                                                                    nVidia’s word is enough for me. They aren’t children; they know that their declarations could be refuted in a day if wrong, and no one has.

                                                                      Again: GFXBench is written with crap, and works with crap. If you fail to understand this point there is no reasoning. You can’t compare hardware on different OSes. That’s it. Especially if you already know that one is unoptimized and weak in that specific department (and you should know).

                                                                    3D graphics is about efficiency and everyone cross-licenses some technology to make their designs more efficient. nVidia’s top performance with Maxwell at a given bandwidth shows that it has its own technology to preserve bandwidth.
                                                                    AMD surely has its own technology too.
                                                                    But that’s not the point. The point is: what is their technology doing with, say… 25 GB/s (as speculated)? A lot of sh1t on screen, as shown by the Skyrim demo (and by every game shown until now). That’s the point.
                                                                    You can’t just talk ‘terms’ while avoiding what the Nintendo Switch is actually providing on that TV screen.
                                                                    Terms are all bubbles. As I said earlier, by reading some technology definitions you can say just ‘wow’, then it turns out it was all related to a Mediatek and the ‘wow’ disappears.

                                                                    nVidia did that Nintendo Switch; maybe it doesn’t convince you, but it runs Skyrim on a 7″ screen better than a PS3. It definitely convinces me.

                                                                      1. I have seen those demos and already knew about them; do you really think they can compare to this?
                                                                        http://1u88jj3r4db2x4txp44yqfj1.wpengine.netdna-cdn.com/wp-content/uploads/2017/01/skyrim-2-930×518.jpg

                                                                        Actual in-game content, no CG. Though if you still maintain that the A9X is more powerful than, or even equal to, the Tegra X1, it’s nothing that would hurt me. There are so many religions in the world. ;)

                                                                        And: http://wccftech.com/nvidia-tegra-x1-benchmarks-apples-a8x/

                                                                        I’m not a Tegra zealot; I don’t gain anything by showing its superior performance (per watt, or anything).

                                                                        1. What do you mean by “No CG”? Everything a GPU draws is CG. It’s also irrelevant, because I was comparing two different GPUs running the same exact thing, so it IS comparable. Honestly, if you’re trying to make it look like you’re not an Nvidia fanboy, you’re doing a terrible job.

                                                                          What you just linked to was an article comparing the Tegra X1 to the A8X, not the A9X. In fact, it’s talking about the same exact test I mentioned a few posts ago. You’re right, there are many religions in the world, and I believe in none of them. The A9X IS more performant, and that’s not “belief”, it’s fact, and it’s proven by the benchmarks I linked you to which you constantly choose to ignore. They’re literally scores from the same exact benchmark used in the article you linked to. It’s like you think the A8X and the A9X are the same or similar. The A8X uses an 8-cluster 550MHz 6XT and the A9X is a 12-cluster 450MHz 7XT. The latter is twice as powerful as the former.

                                                                          I’ll recap because you’re clearly confused by what I’m claiming.

                                                                          The TX1 is more performant and efficient than the A8X.

                                                                          The A9 is about on-par with the A8X.

                                                                          The A9X is more performant and efficient than the TX1.

                                                                          The A10 Fusion’s TOP performance is close to the TX1’s TOP performance in the Shield TV. I say that because it’s the best way to fairly compare the performance of an actively cooled chip vs a passively cooled chip without making up numbers. The TX1 tanks in performance when passively cooled, which benchmarks of the Pixel C prove.

                                                                          You can’t compare FLOPS between different GPU architectures. You can look at them to try to gauge expectations for how powerful a chip is, but you can’t assume one GPU is better than the other just because of its theoretical performance.

                                                                          Half-precision floating point math IS useful in 3D graphics especially for fragment shaders.

                                                                          Nvidia’s mobile GPU architectures are less efficient than PowerVR, Adreno, or Mali. That’s why TX1 was never used in anything smaller than a tablet. Meanwhile PowerVR, Adreno, and Mali GPUs are all used in phones.

                                                                  2. Also, the Switch is rumored to be 157/315 GFLOPS when it’s not docked and 393/786 GFLOPS when docked.

                                                                    GPU manufacturers get those numbers by multiplying the clock speed by the ALU count by two FLOPS per ALU (one MADD). So the Switch docked is 256 ALUs x 768 MHz x 2 FLOPS = 393 32-bit GFLOPS, x 2 = 786 16-bit GFLOPS.
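
                                                                    In code form, that arithmetic is just this (a quick sanity check of the rumored figures above; the 256-ALU count and the 307.2/768 MHz clocks are leaks, not confirmed specs):

                                                                    def gflops(alus, mhz, flops_per_alu=2):  # 2 FLOPS = one multiply-add per cycle
                                                                        return alus * mhz * 1e6 * flops_per_alu / 1e9

                                                                    for mode, mhz in (("undocked", 307.2), ("docked", 768.0)):
                                                                        fp32 = gflops(256, mhz)
                                                                        print(f"{mode}: {fp32:.0f} FP32 GFLOPS / {2 * fp32:.0f} FP16 GFLOPS")
                                                                    # undocked: 157 FP32 GFLOPS / 315 FP16 GFLOPS
                                                                    # docked: 393 FP32 GFLOPS / 786 FP16 GFLOPS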

                                                                  2. It’s also worth noting that that 4310 mAh battery still only gets the Switch 2.5-3 hours of battery life while running at 307.2 MHz. The iPhone 7 Plus gets around 2.8 hours out of a 2900 mAh battery. Yes, the Switch has a 12% larger screen, but the screen on the iPhone 7 Plus has to drive 2.25 times as many pixels.
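
                                                                    For what it’s worth, you can turn those numbers into an implied average power draw. The ~3.8 V nominal cell voltage is my assumption (it’s not a quoted spec for either device), and I’m taking 2.75 hours as the midpoint of the Switch’s range:

                                                                    V_NOM = 3.8  # volts -- assumed nominal Li-ion voltage, not a quoted spec

                                                                    for name, mah, hours in (("Switch, portable", 4310, 2.75),
                                                                                             ("iPhone 7 Plus", 2900, 2.8)):
                                                                        watt_hours = mah / 1000 * V_NOM   # capacity in Wh
                                                                        print(f"{name}: ~{watt_hours / hours:.1f} W average draw")
                                                                    # Switch, portable: ~6.0 W average draw
                                                                    # iPhone 7 Plus: ~3.9 W average draw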

                                                        2. It shows that they don’t have a lot of faith in the Switch if they are pulling a stunt like that. The 3DS is fine; it seems to me Nintendo is just doing what Inafune did with Mighty No. 9. I will say this: once I’ve got Fire Emblem Echoes and Breath of the Wild, I think I’m leaving Nintendo due to their annoying and stupid choices, and playing either retro games or just not playing altogether.

                                                      2. Nintendo First Order Commander Quadraxis

                                                        ||2019 will be the earliest showing of a prototype, most likely only for the rest of High Command and the dreaded Titans…||

                                                      3. Really? I thought the synergy with the Switch would kill a 3DS successor.

                                                        Now, I originally thought a big surprise with the Switch would be 3D capability, but that didn’t happen. And if you sell the Switch “dockless” then in theory it would be 220 or less, or you could drop it to 200 soon when the overall system price drops… anyway, maybe there is still a 3DS market in the future. It’s going to be very interesting to see how that works.

                                                            1. }{ Actually, the dock itself doesn’t do any boosting at all… Similar to a laptop, when the Switch is plugged in it simply tells the machine to use its full power rather than the limited power it uses during portable mode… It really is a glorified HDMI cable, albeit combined with an AC adapter… }{

                                                              1. I know that. I never said it boosted, but it does “charge” it as far as power goes, if I am not mistaken.

                                                                The major point is that they charge 89.99 for a dock separately, so the Switch without a dock, could cost less.

                                                                1. It doesn’t cost Nintendo anywhere near $90 to make the dock. I wouldn’t be surprised if $80 of that is profit. Still, it would cost less without it, but it was made to need it.

                                                                  1. Except it doesn’t need it. That’s the point… and the Switch as a whole doesn’t cost 299 to make (nothing costs what it is sold for).

                                                                    But if we go by that, it is possible that a lone Switch, with no dock, could be sold for less… that is all I am saying. Not saying they will, just that they could. And because it will get a price drop…

                                                                    1. Oh wait, I forgot it just uses USB Type-C, so yeah, you’re right, it doesn’t need it. Still, the dock probably accounts for very little of its MSRP. The whole package has way more profit built into it than needed, which I don’t understand. If people didn’t think and know it’s overpriced, then I could understand, but otherwise it’s just making people wait to get one.

                                                                      1. We will see, but I still think the price is not bad at all. Brand new system at 300 with full portability, two controllers (whether you like them or not, it still technically has two) and you can play on the TV. The only thing that sucks is no game. Would have been nice. A BotW bundle would have killed it… my guess is, bundles for the holidays.

                                                                        1. I don’t know. I guess it’s because I’m a tech guy but I don’t see the value in it. There’s really no role that it’s well-suited for. Yea, the games are gonna be good but Nintendo could make a classic on anything.

                                                                          1. You mean the value in a portable system that powerful? I don’t know what you mean by tech guy, but from what I’ve seen, the opening price point is right where I expected it.

                                                                            1. By tech guy I mean that I’m a guy who follows and is interested in tech. For example, I’m always looking up benchmarks and reading PDFs that give more in-depth information about GPU architectures or SoCs, I like looking at teardowns of products and seeing what their bill of materials is to see how much components cost the company, and I like reading documentation that hackers release about a console, like how each chip is connected and via what protocols.

                                                                              I’m gonna sound like a broken record here but, in this case, I’m not impressed by the power or cost of the Switch. Its CPU is based on a licensed ARM core that’s been in phone and tablet SoCs since 2014, but the Switch has it clocked at around half the speed that most of those SoCs use. The GPU is likely 256-core, Maxwell-based, set up to combine two 16-bit operations into one 32-bit operation like on the Tegra X1, and it runs at lower clock speeds (docked or not) than it’s ever been clocked at before due to the size of the Switch and its lack of active cooling.

                                                                              We have benchmarks for passively and actively cooled X1 powered devices with its CPU running at up to twice the speed the Switch is and its GPU running at up to 30% higher than the Switch in TV mode. You can ballpark the Switch’s performance based on that and leaked clocks.
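
                                                                              As a rough illustration of what I mean by ballparking from clocks (the baseline score and 1000 MHz clock here are placeholders, not real Shield TV results, and linear scaling with GPU clock is an assumption, since memory bandwidth and CPU limits will bend the curve):

                                                                              # Naive linear-with-clock estimate. baseline_fps stands in for a real
                                                                              # measured Shield TV benchmark score; swap in an actual result to use it.
                                                                              baseline_fps, baseline_mhz = 60.0, 1000.0

                                                                              for mode, mhz in (("handheld", 307.2), ("TV mode", 768.0)):
                                                                                  print(f"Switch {mode}: ~{baseline_fps * mhz / baseline_mhz:.0f} fps")
                                                                              # Switch handheld: ~18 fps
                                                                              # Switch TV mode: ~46 fps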

                                                                              The Switch in handheld mode would be about on par with the iPhone 6, which came out in 2014, had half as much memory bandwidth, and used an SoC that cost $20 at the time. In fact, the materials + manufacturing cost of the iPhone 6 back then was $200, and it had two cameras, two microphones, and a cellular modem (which cost $33), which the Switch doesn’t have. Even the iPhone 6s, which had a 1080p screen, only cost $215.

                                                                              When docked, the Switch most likely underperforms the A10 Fusion found in the iPhone 7 and is on par with the A9 found in the iPhone 6s. Those chips cost $27 and $22 respectively, have just as much memory bandwidth, and don’t have the cooling benefits the Switch gets from its size and huge ventilation holes.

                                                                              Let’s bring in a non-iPhone example, the Galaxy S7. The S7 costs $255 to manufacture, has an HDR OLED screen with four times the resolution of the Switch, faster memory, 12MPixel and 5MPixel cameras, and performs similarly to the iPhone 7.

                                                                              Looking at all of this, it’s hard for me to see the Switch as some amazing, powerful, well-priced piece of hardware, especially when I know the chip it’s using is probably cheaper than the X1: the lower clock speeds will improve yields, and the chip will be smaller since things like the MIPI-CSI interfaces (the X1 supports six cameras while the Switch has none), the ISP (there for the cameras), HDMI (the Switch uses the dock for HDMI), and the four A53 cores (used for low-power processing, which wouldn’t apply to gaming) will all be removed, and the display pipeline and video encoder/decoder (all supporting 4K @ 60fps) can be downgraded.

                                                            2. I don’t think they’re willing to give up their profitable two-platform system just yet. Their handhelds have always been very popular, and they were fortunate to have the popular 3DS to combat some of their losses from the Wii U.

                                                              With the future handheld, they could focus more on longevity and portability (The Switch IS pretty big compared to the 3DS), and bring back dual screens, possibly 3D and whatever new features/gimmicks they can think of. As long as Pokémon remains on their dedicated handheld, they’ll push systems.

                                                                1. I previously thought we’d see the new 3DS successor in 2018 or 2019, but with this taken into consideration, 3+ years is looking more likely.

                                                                1. There would be no point to the Switch’s portability at that point and then people would wonder even more why they didn’t just make a traditional console.

                                                                  1. Yeah… I think this portable, the Switch, will prove to be a game changer. Sure, there are naysayers and such, and the “I don’t like handheld” crowd… but with so much being mobile these days, and the Switch being connected to some mobile apps via the phone which just about everyone has, this is the type of thing people didn’t even know they wanted… and for nintendo, they will straight set Japan on fire with it and that will be a good base. I think it will sell better than people think and the preorders are a good sign… however, the caveat there is that the WiiU had supposedly “strong” preorders as well

                                                              1. The Switch is high end, the 3DS is low end. Why is it so difficult to understand?
                                                                High price -> low price, complex games -> simple games, short battery -> long battery, Joy-Cons -> dual screen, very big screen -> small screen.
                                                                It’s so easy to understand that they wouldn’t overlap. Why don’t you?
                                                                Hybrid vs. handheld: there is space for both. In the distant future too.

                                                                1. And why exactly can’t a low-end, portable-only Switch be released? The bigger 3DS models all go for around $200, so why couldn’t that work instead of stupidly splitting development resources again? Look at Sony. The PS Vita flopped hard, so they now have everything behind the PS4. Why can’t Nintendo do the same with the Switch, having all of their games in one package?

                                                                  1. Understand that games like those on the Switch can cost multiple millions to make; 3DS games even less than a million. That’s reflected in street price. Moreover, the system itself costs much more: 299 vs 199. You can’t really compare them. And the 3DS already hit a too-high price level; maybe they can lower it in the long run to gain even more customers.
                                                                    Better to play with two (definitely) different systems so they can extend their influence over people; they will catch them all. Whoops! ^^

                                                                    1. They’re not definitely different if they’re both portable. Not in the eye of the average consumer anyway. You do realize that even with HD development, Nintendo’s games go nowhere near the $100+ million budgets that other studios go for, right? And mostly only indie games go for sub-$1 million. I bet you the 3DS’s bigger games all cost more than $1 million.

                                                                      1. The Switch costs far more, it’s far less portable (a big 6-inch screen), its games cost a lot more, and its battery lasts less. No, they are very different.
                                                                        Switch development can cost even ten times more than 3DS development; it isn’t directly comparable. Gran Turismo 5, a PS3 game (supposedly a less powerful console than the Switch), cost 60 million to make. Do you really think that a sub-million-a-game platform can really take away the developers needed for a Switch game (supposedly a 10-15 million budget)?
                                                                        The Nintendo 3DS is just a cash cow; it would make no sense to kill it.
                                                                        For example, Pokémon on the Nintendo Switch should cost 30 million to make instead of ‘2 million’ (supposed, since it’s a big RPG game); why should Nintendo do that when with ‘2 million’ they sell 16 million copies of it?

                                                                        1. The original 3DS had the same battery life. You still haven’t given me a legitimate reason why they couldn’t just do a smaller, more portable Switch. And they wouldn’t have to go crazy with Pokémon; just make it smoother looking with the upgraded hardware. Remember how they did NSMBU?

                                                                          Also, yes, I believe the sub-million platform would take development away from the Switch, just like it took it from the Wii U.

                                                                          1. Because:

                                                                            A. It’s powered by a battery-hungry tablet chipset; you can’t just shrink it. It necessitates a big battery (4310 mAh; by comparison, the 3DS’s is 1350… it is double the size). The chipset costs, the battery costs, the high-definition LCD costs.
                                                                            B. The 3DS has a dual screen, which seems popular in the handheld market; the clamshell is popular too.
                                                                            C. The 3DS is already selling worse than old cheap handhelds; they just need to lower its price point.
                                                                            D. With superior hardware it would be necessary to upgrade all the 3D stuff in Pokémon, like they did with the 3DS version; otherwise it would look outdated in comparison to the other games on that platform.
                                                                            E. Games will cost customers more, ergo fewer customers… and Nintendo wants them all.

                                                                              1. But why do you say so? You should not be worried about Your console, since there is so much attention on it. It is already working out, even in the company of its little portable brother.
                                                                                Also remember that you can capture a customer with the 3DS and then make him interested in jumping to the Switch too. Maybe a kid would have money just for a 3DS, but then he grows up and he’s a loyal customer that will play Your system too.
                                                                                I’m completely uninterested in the 3DS, I will buy just the Switch… but I don’t feel at all worried about it; I welcome it, since it expands Nintendo’s family.

                                                                                1. I’m talking about a 3DS successor. The current 3DS probably has a year left at most. A “true successor” could kill interest in the Switch. If they still wanted a dedicated handheld then the hybrid concept of the Switch is next to worthless.

                                                                                    1. That will be bought by those in search of that kind of experience (not me, nor you I suppose).
                                                                                      The Switch is a home console, just with benefits.

                                                                                      1. The most interesting possible benefit would’ve been a unified library of handheld and console games to go along with its hybrid concept. “Wow, I can play a game like Breath of the Wild on the same system as Pokémon!!!” It would’ve also eliminated the redundancy of making two versions of a series for each console, like Mario Kart, giving them more ability to work on other kinds of games. Maybe instead of two different Mario Karts, we could get one Mario Kart and one F-Zero.

                                                                                        1. It’s still possible to port games from one system to the other if there’s interest. Porting games is easy; if it’s rarely done, it’s because customer interest isn’t that high.
                                                                                          Who knows, maybe we will see that in the future, though I’m not holding my breath. I’m looking for Switch-quality games on that fantastic console; lower-quality games can somewhat confuse the specific customer base.
                                                                                          Surely we will see Pokémon games on the Switch too.
                                                                                          They gain money from all those Mario Kart versions, heh. It makes the Company richer and more able to open new teams or enlarge existing ones. If they spend 1 € they gain 2 or more.

                                                                      2. I knew Nintendo wouldn’t cannibalize its 3DS cash cow. The Switch is a home console. They hope to sell it to console buyers.

                                                                        The 3DS is the official handheld. Its successor will be the new handheld.

                                                                        They can’t afford to put all their eggs in one basket. Not with a still-controversial system like the Switch. They can’t count on a user base bigger than console and handheld combined from a system that’s only supported by indies and a handful of AAA launch titles.

                                                                        I would have loved for the Switch to play 3DS games, but when we found out you can’t plug in 3DS carts, I had a feeling. Of course, them saying it’s a home console and emphasizing the point kinda gave it away too.

                                                                          1. Perhaps… I respect the opinion, but I still think this is a conservative statement and approach… any “true” 3DS successor can be scrapped if the Switch sells extremely well. Then the tech will catch up and in 8 years there’s another Switch, no 3DS… why two pieces of hardware if this is the way Nintendo is going?

                                                                            1. Two pieces of hardware, so if one fails, the company does not fail. Just my opinion, of course.
                                                                              Still, I would like to play all my games on one console, but I can’t imagine it being easy to be “that dev” that puts out a graphically inferior game on an HD console because it’s supposed to be a handheld title. Pretty sure they would be scorned and mocked for making a “weak game.” You have different expectations developing on the 3DS than on the Switch.

                                                                              People expect big, solid games on the Switch. Games made for handheld may look half-assed, whereas if they were on the 3DS, they would look fine. IDK. Just playing devil’s advocate.

                                                                              1. You just scrap that 3DS development once the 3DS finally runs its course (which hasn’t happened yet, because it is a cool little handheld and the “New” one improved some of those things), but soon they can just allocate all development to the Switch…

                                                                                I see your point, but I still think it all depends on Switch sales. The 3DS is still there and doing well… and hell, that system doesn’t really get AAA 3rd party content, is nowhere near HD, and does great… I mean, in some sense, it is totally logical why Nintendo does not let power/graphics dictate their decisions. Fans/consumers just don’t have the mental makeup to be leading a company like this, because you have to be patient, you have to see the big picture, and you have so many variables… anyway, I like this Prez… all is on the table, and he has a very good sense of the landscape in my opinion.

                                                                              1. Switch sales will decide many things, I can agree on that. Including 3rd party support, install base growth and the catch-22 that exist between the two dynamics.

                                                                                It may impact Nintendo’s handheld strategy. We shall see!

                                                                            2. Two handhelds cannibalizing each other’s sales and competing for development resources. Sounds like a terrible idea to me. Makes more sense to just release the Switch with a smaller form factor and be done with it.

                                                                              1. Switch is a home console by Nintendo’s definition. So everyone can keep calling it a handheld, but if Nintendo sees it as a home console, then it will not replace the 3DS.

                                                                              2. They don’t have the same specs, and developers don’t develop the games the same way; they are two different markets with two different types of games.

                                                                                Now you folks know how I feel when arguing Nintendo’s dumb ass decision. I happen to love the N3DS, so this is good news for me, but I understand how it feels to have expectations then have Nintendo trample them. So I know how you feel, even if I don’t agree on this subject.

                                                                            3. If Nintendo wasn’t fucking stubborn as all hell and just put Pokémon on it, it wouldn’t fail. Also, not every game on every system takes full advantage of the hardware. Games that are normally portable but mostly just given a clean HD shine would be fine. Going back to Pokémon, it wouldn’t have to go hyper-realistic with detail on the Switch.

                                                                              1. Aye. If Nintendo put Pokémon on the Switch… and gave it another upgrade, like the difference between B/W and X/Y – not an epic upgrade, just an incremental upgrade – the Switch would take off like wildfire, and they would pull in 3DS owners. I just don’t think it’s going to happen that way.
                                                                                Probably a dual release between the 3DS and Switch, with cross-platform play. Maybe two Pokémon gens from now, while the Switch is 60-70% through its life, they could combine platforms.

                                                                                Getting Fire Emblem over to the Switch is a start, but they really need Pokémon.

                                                                        2. It doesn’t matter if the user base isn’t bigger than console and handheld combined. Even with about 80 million consoles sold this gen, Nintendo lost a ton of money. The Switch will be sold at a profit from day one. Even if it only sold 50-60 million, it wouldn’t be a failure if it actually made them money.

                                                                      3. No, the Switch is
                                                                        1) a home system that can be taken on the go,
                                                                        2) a solution that is easy to develop for and covers the best Nintendo ideas in one,
                                                                        3) meant to focus on Nintendo’s future ideas going forward in gaming.

                                                                        If they do make a new system, it will most likely
                                                                        1) use similar tech as the Switch in terms of chips (Nvidia has already confirmed this),
                                                                        2) be a device made to focus on being a 3DS/Wii U hybrid, with the goal being backwards compatibility for every older system using Virtual Console as its selling point (Game & Watch, Famicom, NES, Super Famicom, Super NES, Game Boy, Game Boy Color, N64, Game Boy Advance, GameCube, DS, DSi, Wii, 3DS, New 3DS, Wii U),
                                                                        3) be made to complement the Switch, by being a handheld that can connect to the TV, be easy to develop for (using the same tech chips as the Switch), and be more smartphone-like (better cameras, 3D effect, OLED screens, 720p or higher, multitouch on two screens, and probably pairing with your phone).

                                                                        And when Nintendo says “we’re thinking about it”, it almost always means they are already making it; they’ll probably announce it at E3 next year and it will come out in 2019. Its launch games will almost all be Virtual Console or HD remakes (probably with dual releases for the Switch).

                                                                        1. I like that you have well-rounded thoughts on it, but it is hard to agree with some of it… This is a handheld that connects to a TV (the Switch); even if you say that it is a console you can take on the go, those things could be stated either way and still be the same… it doesn’t make sense (to me) to develop hardware so close to this new flagship console that you need to sell, and also to take game development resources away from the Switch while trying to get games made for that system, which might very well dry up the resources needed to keep games steadily flowing to the Switch.

                                                                          I will have to wait and see on that one and remain skeptical.

                                                                          1. Well, the dev process would end up much like iOS: you don’t really make a game for iPad and iPhone, you just make a game for iOS, then later maybe make tweaks for the differences. So we’re really talking about a week’s worth of difference in development resources.

                                                                            Also, this new device to replace the 3DS would be a Virtual Console-focused device; new games (from Nintendo) probably wouldn’t happen. The idea is to hold your library of old games, and maybe take new games from third parties, and when attached to the TV it becomes a Wii U-like device.
                                                                            This device would be a handheld for old games that can work on the TV.
                                                                            The Switch is a home system that (for a very limited time) can become portable, focusing on new games.

                                                                        2. You’ve really hopped on the hype train, haven’t you? If it’s a system for on the go, why does it have shitty battery life? Why do you say it incorporates Nintendo’s best ideas when it’s actually an evolution of Nintendo’s absolute worst ideas from the Wii and Wii U (motion controls, poor third party support, underpowered hardware)? And what are Nintendo’s future ideas exactly? 1-2 Switch? HD Rumble? Even more gimmicks?

                                                                          Your idea of what a 3DS successor might look like is nothing more than a fanboy wet dream. What the hell does a 3DS/Wii U hybrid even look like? Gaming on three screens?

                                                                          This happens every time. People’s expectations for Nintendo’s next console soar astronomically high, fueled by unwarranted and unsubstantiated speculation and rumors. Then Nintendo does exactly what everyone should’ve predicted, and they release an underpowered, overpriced, gimmicky system that lacks adequate third party support. Then they fumble around on a few other things and give the general impression that they don’t give a shit about their disenchanted core base, opting instead to chase after the fickle casual crowd for a third outing. Good luck, Nintendo. I can see through the slick marketing, and what I see is a company without a sense of direction that’s at last gasping for air. I can all but guarantee that this is Nintendo’s last console, or whatever you call this Frankenstein of a thing. The fact that they relented and are making games for mobile now is the writing on the wall.

                                                                            1. I’m pretty much on the same page as you are; however, I do think the Switch will sell better than I originally thought, especially after the inevitable price cut. And in Japan alone it looks like it could be huge.

                                                                              1. I do it all the time. So you won’t find me throwing any stones at you! Haha. I prefer to consider myself Realistic rather than Negative… :D

                                                                              2. Well, for the better part of 6 years my comments and predictions have been spot on, so honestly I’m just trying to speculate based on how I know Nintendo makes moves… which, again, I have done very well.
                                                                                I called the Switch name, HD Rumble, Nvidia chips, a sandbox Mario with a real-world area, a Zelda inspired by Skyrim, the New 3DS, the 3DS version of Hyrule Warriors, and more.
                                                                                “A fanboy’s wet dream”? Don’t be so jaded on video games; you sound like a hipster.
                                                                                Motion controls aren’t bad; they can be good or bad, it just depends on the game. HD Rumble is amazing for immersion; Xbox did something similar.

                                                                      4. No, not in theory. The dock isn’t worth nearly the inflated price they are charging for it; there is very little real tech in it. It might bring the price down $20.

                                                                              1. I suspect if not this holiday, then later in 2018 I will be able to get it for $250 with a bundled game from a retailer willing to eat part of their profit margin to push games and accessories. ;D

                                                                    2. What’s the point in doing that? The Switch is fully portable, and it’s actually a pretty powerful system, let alone a powerful handheld. Why not dedicate handheld games to the Switch instead, and just flood the damn thing with games? I don’t think people would complain about playing games like Mario and Luigi, Pokemon, Kirby Planet Robobot and such, on the Switch. You can even keep the same level of graphic fidelity, nobody needs Kirby built on UE4. Just take 3DS games, give them 1080p 60fps and we’ll all be happy.

                                                                      Putting out a handheld weaker than Switch, but more powerful than 3DS doesn’t make any sense. Just put the games on the Switch.. and if for some insane reason it was more powerful than the Switch, well then, replace the Switch with it. I’m thinking this is just the backup plan for a potential Switch failure. There is no way in hell they release a new handheld if this thing starts selling like the Wii or the DS did. They’ll slowly phase out the 3DS family like they did the GBA family, is my guess. If the Switch fails, well then they will just make a 3DS 2.0 and go from there.

                                                                    3. Wow, really?

                                                                      Switch is screwed. Congatufuckinglations Nintendo. Just when I thought you were making all the right decisions.

                                                                      1. This could be a good decision. Look, the 3DS is probably the only reason Nintendo is making the Switch, because the Wii U did abysmally. Their handhelds do great for Nintendo, always have. The Switch is a home console experience on the go, and the ads show that. The next DS is sure to be a dedicated handheld, not a console, which is what the Switch is marketed as.

                                                                        1. And the Switch will now suffer from the same split development that killed the Wii U. Nintendo can’t support more than one console effectively anymore.

                                                                          1. Split development was not what killed the Wii U. The original Wii sold really well and it had the DS at its side, which sold 152 million handhelds, but no one calls the Wii a failure. The Wii U had many problems, but not because the 3DS was out alongside it.

                                                                            1. Remember how we kept getting 3DS focused directs with new first party games but just kept waiting and waiting and waiting to hear about Wii U games? Yeah, it’s about to happen again.

                                                                              Fuck Nintendo if they do this again.

                                                                              1. The reason why that was is because the Wii U really died at the end of 2015, and Nintendo was desperately trying to get rid of it while promoting the only thing that was making them money, the 3DS. If they had thrown all their eggs into the Wii U basket, there’s a possibility Nintendo would be a 3rd party dev now.

                                                                                1. One way they can make sure the Switch doesn't fail is by putting all their games on it and making it the true hybrid console it should be.

                                                                          2. That's not true; they can do anything they want. It's just a matter of money invested, and they have plenty of money in their purse. Aren't you already happy with the 2017 software line-up? They haven't revealed everything, but within 9 months we will be playing Zelda, Mario Kart, Mario Odyssey, Splatoon and Arms. They can support the Switch if they want; they're doing it right now while still developing games for the 3DS. They just have to invest more money, which isn't a problem for them; they just need to be brave. And Kimishima certainly looks like a bold one, doesn't he?

                                                                      1. No, they have repeatedly said that the Switch is its own device and not meant to take over from the Wii U or the 3DS. Remember, from the start they said the NX will be a family of devices, like iOS (iPhone, iPad, iPad Pro, iPod touch); the Switch is only the first NX device.

                                                                      2. Let's say the Switch is a portable home console… as long as Nintendo makes games for it and has made it third-party friendly, then things are fine, yeah?

                                                                        It has 100 games on the way, and strong pre-sales. Except for this hiccup with it maybe not getting 3DS/mobile games, Nintendo fans absolutely love everything else about it.

                                                                        So maybe it will be fine.

                                                                    4. I thought the whole point of the Switch was to merge their console and handheld markets into one. Making the "4DS" or whatever it is would be a huge step backwards, especially if they make this thing years from now. The next DS would probably be even more powerful than the Switch at a decent price, so then what? Make a Switch successor that's even more powerful than the next DS? I just don't understand it. The whole thing about the Switch is that it's a home console that is also portable, finally removing the frustration of having to own more than one Nintendo device and having to purchase the same thing countless times on both of them. I don't see how the next DS can work, unless the next Nintendo home console, besides being both a console and a handheld, also had dual screens like the DS.
                                                                      Finally, I just don't understand what they mean when they say the DS market needs to be catered to. That's supposed to be the huge benefit of the Switch: the fact that it caters to both Nintendo's home console and handheld divisions. It just sounds to me like Nintendo is only LOOKING for ways to cause more confusion.

                                                                      1. The point of the Switch is to replace the Wii U and allow people to play console games together, not to eliminate their handheld line, so it seems.

                                                                          1. So you can play Mario Kart 8, basketball, 1-2-Switch, and Splatoon together at each other's houses! It's in the commercial! If you can't get online play and voice chat right, you may as well push people into the same room so they don't need either.

                                                                            Or… IDK what they’re doing. They expect people to go out and play together. They want people to have fun together like they did with the Wii.

                                                                            1. And it would make total sense if this were their one console. It has all the features of a handheld you could ask for. I just hope to God this is some sly way of holding off on saying that the Switch is the 3DS's successor until later.

                                                                              1. Well, I could see Nintendo doing that. At the end of the day, we’ll have to wait and see what they do. I suppose anything is possible.

                                                                      2. Apparently the point is to get family and friends to play like they did in the Wii days!
                                                                        Let’s bring back the Wii days and let the good times roll! Except I don’t think that’s what the West wanted.

                                                                    5. He's just saying that to make us all shut up about the Switch replacing the 3DS. If the Switch fails, they will depend on the 3DS for income. If the Switch is a huge success, they will ditch the 3DS. We just got the "New" 3DS two years ago, and so far it only plays ports of SNES and Wii games, but I'm sure it has the potential to do more.

                                                                    6. I hope he abandons that idea, or that something similar to the GBA successor happens (i.e., it gets dropped in favour of the DS).
                                                                      There are many reasons why another Nintendo handheld is a bad idea, the majority already listed in the comments above.

                                                                    7. Sorry, everyone. Not only is the 3DS not going anywhere… it's getting a successor. I think I may wait and buy that.

                                                                      1. You may want to consider redirecting your rage at the actual party that deserves it, not the people who are quoting Nintendo.

                                                                        How the f*** does it make any sense to get mad at somebody for calling the Switch exactly what Nintendo is marketing it as? If Nintendo let you down, be mad at Nintendo and voice your opinion.

                                                                    8. Nintendo's bread and butter is the handheld market. I'm adamant that Nintendo is in trouble with the Switch. Perhaps not Wii U trouble, but they will have their backs to the wall once again. Once the initial hype wears off and they have 5-6 million units out there, the Switch will experience an identity crisis. Its very nature and design put it straight into niche territory: it cannot provide the best portable experience, and it will not be the best home console either, so it falls somewhere in the middle. Where does a consumer go if they want the best home console experience? Likewise, where does that consumer go if they want the best portable experience? More and more, this looks like another piece of Nintendo hardware that will cater mostly to existing fans. As a handheld purist myself, I can say with certainty that the Switch will not replace my New 3DS as my go-to portable, and that is a real concern for me. Perhaps Kimishima's comments are in response to the people out there thinking the same thing I am; hopefully they see the writing on the wall. I love the Switch in theory, but it has some serious hurdles. Even if it's a huge success, how can it be a better portable with its huge size and weak battery? It's unlikely, but if I buy a Switch, I would probably never take it anywhere outside the house.

                                                                    9. I don't want to have to carry a Nintendo Switch AND a 3DS successor to play the latest and greatest Nintendo handheld games on the go… When I preordered the Switch, I expected the future of Pokemon to be THERE.

                                                                    10. If they create a portable that isn’t just a more compact Switch, then it’ll completely ruin the point of the Switch, which is basically a hybrid that consolidates Nintendo’s developers onto one system.

                                                                    11. Pingback: Tech Central

                                                                    12. Well… new roadmap:

                                                                      2017 > Switch
                                                                      2019 > Full New 3DS/4DS: two screens for compatibility, 3D is still a thing, maybe equipped with a portable VR device.
                                                                      2020 > Switch Pro: VR and more CPU/GPU power, as seen in the leaks. Will compete with the PS4 Pro and, more likely, with Scorpio.
