
Yoshiaki Koizumi Hopes Nintendo Switch Can Be The System That Bridges Both Handheld And Home Consoles

In a recent interview with TIME, Nintendo Director Shinya Takahashi and game designer Yoshiaki Koizumi spoke about the upcoming Nintendo Switch console.

When asked whether the Switch’s life-cycle will resemble that of the company’s home consoles or its handhelds, Takahashi offered his perspective from both sides.

“Certainly, we’ve designed Nintendo Switch in a way that it can be used by consumers in the way that best suits them. I think we may see that people who have bought a Nintendo home console in the past traditionally, they may treat Switch like a home console and buy it and use it for a long period of time.”

“Whereas people who have been traditionally Nintendo handheld gamers, they may buy the Nintendo Switch and then for example, if a new version were to come out later, then maybe they would decide to upgrade to that.” Takahashi continued, “there’s obviously a lot of different developments that we could look at from that perspective as well.”

Nintendo game designer Yoshiaki Koizumi shared his own view on the Switch’s life-cycle, noting that he hopes the new console will be the system that bridges both home and handheld consoles.

“We’re hoping that Nintendo Switch will be a system that will be the constant in your gaming life,” he added, “whereas previously, you would play certain things on your home system and certain things on your handheld. Our hope is that Nintendo Switch can be the system that bridges both of those and becomes the constant system that you’re always using.”

The Switch can be played at home on the TV, or on the go as a handheld with the Joy-Cons attached. Its predecessor, the Wii U, could be played on the GamePad without the TV screen, but couldn’t be taken far from the home console. It looks like Nintendo is keeping its options open on the Switch’s future, and it will certainly be interesting to see what’s in store in the coming years.

Source

74 thoughts on “Yoshiaki Koizumi Hopes Nintendo Switch Can Be The System That Bridges Both Handheld And Home Consoles”

    1. My bet to see a major 3DS release in a year just got riskier! Lol. But I’m not changing my opinion that easy!! :]

    2. They hope. If the Nintendo 3DS sinks in 2018, it will be; otherwise there will be another handheld.
      But I’m open to the idea that a new handheld could share a shrunken Tegra chipset and eventually keep ‘Switch’ in its name. ‘Switch Pocket’, maybe? It remains to be seen whether it would keep the Joy-Con; that’s the biggest hurdle (and eventually the meaning of that name).

      They implied the new handheld though, so… ‘if a new version were to come out later’. We will first see how the Nintendo 3DS reacts to the Nintendo Switch in terms of market share. If it maintains its market share there will be a 4DS eventually, maybe with Nvidia technology too.

  1. It sounds like Nintendo hardware is up in the air now and they are at a crossroads.

    That said, unless they have an install-base boom the likes of which only the Wii has seen, people won’t be playing it “a long time,” since the tech is already behind.

    It’s exciting and interesting times for gamers. Microsoft is blurring the line of which gen we are in. PlayStation is basically the stable “control group.” And Nintendo is off doing who the fuck knows what.

    I’ll be happily gaming on the sidelines to see what happens next! :D

    1. Not way behind when you realize that the Switch is by far the most powerful piece of dedicated gaming hardware ever released that you can take on the go.

    2. The tech isn’t behind; it’s actually a different use of tech. Since companies B and C focus on this, this company chose to focus on that. They all do things differently, and in the end they bring different ways to enjoy a pastime.

      1. Technically it is behind. There are chips in phones now capable of graphics on par with or better than the Switch when docked, and there have been chips capable of beating its mobile performance since 2014.

        Tegra was actually the worst mobile SOC choice they could have made.

        1. But this is a custom chip for the system, by most if not all accounts. The numbers don’t mean much, however, if the experience is smooth and consistent and the gameplay is top-notch.

          1. Custom doesn’t mean it’s a completely original design that’s a departure from anything Nvidia’s done before. The Tegra X1 was a multi-purpose SOC designed with a lot of features so that device manufacturers could use the chip for a number of different things, though Nvidia and Google ended up being the only ones to use it, so it appeared in a total of just two devices. Regardless, it was designed with support for up to six cameras, an ISP to support those cameras, support for multiple displays, built-in support for embedded DisplayPort and HDMI, 4K 60fps decode and 4K 30fps encode of h.264 and h.265 video, and it had four A53 cores. None of those would be useful for the Switch; they would just increase its manufacturing cost. Get rid of those and there’s your custom chip.

          1. We’ve had this conversation already. I gave you all the info, and you gave me info that further proved my point. Either you don’t like looking stuff up or you don’t understand the info that’s right in front of your face. Either way, we’re not gonna get anywhere.

            1. But your info was wrong. You are religiously following your beliefs; because of that we can’t get anywhere. Anyway, there is no demo out there for apples that compares to pears, not even tech demos. Also your data on the efficiency of the A8X was all wrong, and I have proven it to you, etc.
              Your last try was a benchmark, but you should know that a benchmark like that is optimized for some chipsets and not others, and it was even showing double the driver overhead with the Nvidia chipset. Do you know that running hardware in two different settings is misleading? It’s not like running Nvidia in the same settings as AMD on Windows… You refuse to understand that 500 GFLOPS is more than 350 even with identical TMUs and ROPs, etc. I do not need you to change religion; I just let you see real numbers (not sectarian ones), real demos, and real games. But you can continue to think what you want; we live in a free world.
              The A9X has 2 cool things: the CPU and the manufacturing process. Nothing else. (And a good OS subsystem, but the Switch OS isn’t Android, so…)

              1. Oh my god, dude! I said two or three times that the Tegra X1 IS more powerful and efficient than the A8X, but it IS NOT more powerful and efficient than the A9X! That article that you linked me to says nothing to the contrary of what I said.

                No, a benchmark is inherently designed NOT to favor one chip over another; that would make the benchmark invalid. If it was designed for one specific architecture, then that would make it just a tech demo. And again, it’s the SAME benchmark Nvidia was using to show the TX1’s superiority over the A8X. It’s also a benchmark that Tom’s Hardware and AnandTech use.

                I’m aware that 500 is bigger than 350, but you don’t seem to understand that GFLOPS ratings are THEORETICAL performance. They’re the number of FLOPS that a GPU COULD do if the design was infinitely efficient and its memory and caches were infinitely fast. You don’t even have to look further than desktop cards to figure this out. The GTX 1060 is a 3.8 TFLOPS GPU with 192 GB/s of bandwidth, so why isn’t the RX 480 wiping the floor with it when it’s a 5.1 TFLOPS card with 256 GB/s of bandwidth? Well shit, maybe it’s because Nvidia has a more efficient design than AMD and we can’t just use its on-paper specs to judge them.
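                To spell out where those paper numbers come from, here’s a rough sketch in Python (the shader counts and base clocks are the cards’ published reference specs, used here as assumptions rather than anything from this thread):

                  # Theoretical GFLOPS = shaders x clock x FLOPs per cycle; a ceiling, not a measurement.
                  def theoretical_gflops(shaders, clock_ghz, flops_per_cycle=2):  # 2 = fused multiply-add
                      return shaders * clock_ghz * flops_per_cycle

                  print(theoretical_gflops(1280, 1.506))  # ~3855, i.e. the GTX 1060's ~3.8 TFLOPS figure
                  print(theoretical_gflops(2304, 1.120))  # ~5161, i.e. the RX 480's ~5.1 TFLOPS figure
                  # Real frame rates depend on how much of that ceiling the architecture,
                  # memory system, and drivers let the chip sustain.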

                What you are right about is that the X1 is on a 20nm process while the A9X is on a 16nm process, and that does give it an advantage, but the TX1 has something of an advantage as well. The A9X didn’t just move on to a newer process from the A8X, though. The A9X was a 7XT part while the A8X was a 6XT-based part. Now this next part seems to confuse the shit out of you, but the 7XT improved performance by up to 60% with the same configuration. I’ll say that another way: a four-cluster 7XT chip clocked at 500 MHz, connected up to the same memory, would be up to 60% more performant than a four-cluster 6XT at 500 MHz. Both would be 128/256 GFLOPS parts but one would be much faster. So even if the A9X were on 20nm again, there still would be an increase in performance, though not as much as there was going to 16nm.

                The other thing you got right is that driver overhead might be a factor. Wha- hold on a sec. I have the results from the OnePlus 3 and Pixel C in front of me, and the OnePlus 3 benchmarks worse in the texturing, driver overhead, ALU, and tessellation tests, yet it’s getting 5051 in the T-Rex benchmark when the Pixel C gets 4738. Why does it get 2906 frames in the Manhattan test when the Pixel C is only rendering 59 frames more? I mean, they’re both running Android, so what’s going on? It’s almost as if the tests that focus on only one specific aspect of a chip don’t matter as much as their ability to function as a whole, and Nvidia isn’t as efficient as you think it is.

                Oh and hey, I get your point about the iPad Pro using Metal while the Pixel C uses OpenGL. That’s a fair point. But wait, it looks like when you change the API on the iPad Pro results to OpenGL, the score actually goes up. That’s right: T-Rex goes from 9046 to 9510 and Manhattan goes from 5079 to 5125. It seems that despite Metal being a low-level API that halves the driver overhead, the benchmarks are actually better optimized for OpenGL, considering there are more Android devices out there than iOS devices. If you were the least bit concerned with researching any of this, then you probably would have noticed that, too.

                Also, the tests in GFXBench actually make sure that things render with a reverse painter’s algorithm, which negates most of the advantage of the deferred part of PowerVR’s TBDR architecture. In the real world, where games can’t always render from front to back, a PowerVR chip would get an additional performance boost by eliminating overdraw.

                Stop claiming I’m some fuckwit with a religious conviction. What “religion” are you actually talking about? Are you talking about Apple? Because I run Windows 10 on both my desktop and phone. I’m assuming you’re talking about PowerVR, and I am a fan of their designs, but this isn’t the first post where I told you that Adreno and Mali are also more efficient than Nvidia’s mobile chips. Is the religion “Nvidia hating”? Because I’m currently using a GTX 960. I’m assuming you’ll ignore all of that, though, because then you can’t find a reason to be dismissive of the info I’m giving you and you’ll have to actually look stuff up for yourself for a change.

                1. Look, I too can post charts: https://cms-images.idgesg.net/images/article/2015/11/ipad_pro_3dmark_ice_storm_unlimited_graphics-100628785-orig.png

                  Though I will not do it because I know it’s not a correct way to compare chipsets on different platforms.

                  You instead posted a benchmark with a 2.5x driver overhead penalizing the Tegra X1. So it isn’t only untrustworthy but even unbelievable.

                  And nope, I don’t care a bit about Tegra; it’s because of this that I’m fair on those two. You instead ‘like’ the A9X, as you admit, making you not exactly impartial.

                  And I have an AMD card, and you know that AMD is plagued by very bad drivers (though it’s cheap, I love it). Plus, PowerVR isn’t a top architecture, otherwise they would just bring this super-efficient architecture to the big PC market; instead it just powered poor old Intel CPUs.
                  They have good designs? Certainly, like Qualcomm’s (AMD-derived). Still, you can’t pretend it tops Nvidia’s expertise. And I repeat ad nauseam, there is not a single cool demo showing its ‘wonderness’, not even by PowerVR itself. The one you posted was kind of ‘meh’, admit it.
                  And you know… Nintendo could have far more easily bought some common PowerVR-based chipset with far lower royalties than an Nvidia one if they wanted. They didn’t want to…
                  You can still think that your iPad Pro is more powerful than the Switch if you like. :rolleyes:

                  1. You know, 3DMark scores are also available from 3DMark’s site, right? And here’s what they say:

                    Shield TV / Pixel C / iPad Pro
                    SSEU Graphics test 1 : 33 / 29 / 37 fps
                    SSEU Graphics test 2 : 19 / 16 / 19 fps
                    ISU Graphics test 1: 288 / 253 / 305 fps
                    ISU Graphics test 2: 234 / 209 / 173 fps

                    Looks like it performs a little better in the tests that run at 1440p and the first Ice Storm Unlimited test. So it looks like it pushes more vertices but fewer pixels. That kind of makes sense, since PVR generally never goes all-in on fill-rate because they have HSR. That goes all the way back to the Kyro II.

                    http://www.anandtech.com/show/735/13

                    Anyway… I guess because I like something, that makes me biased? I like Nvidia too, so why aren’t I speaking in favor of them? I’m also a huge Nintendo fan, but I’m clearly criticizing them. Perhaps I’m not biased and I’m just simply stating what most benchmarks at this point are showing. I understand what you’re saying with the driver overhead, but that test measures draw calls. The other benchmarks clearly do not have a shit ton of draw calls, though, or there would be bigger improvements between the two APIs. Going from OpenGL ES to Metal increases the driver overhead scores by 2.2x in one test and 1.65x in the other, yet performance in T-Rex and Manhattan actually goes down when you switch to Metal. You could attribute this to the tests being poorly optimized for Metal, which is clearly the case, but performance wouldn’t go down unless draw calls already weren’t a huge factor in the performance of those tests. They would have had to fuck up royally to make a draw-call-heavy test dip in framerate using an API that is up to 2.2x as efficient at sending them. It seems weird to me that you’ll dismiss results because of that but will judge GPU performance based on a measurement of theoretical performance.

                    And yeah, the demo I posted wasn’t the prettiest thing in the world, but it wasn’t the furthest thing from what we’ve seen on the Switch, and it’s running at 4K 60fps. Not like we saw the TX1 running anything better looking than the Slingshot benchmark before the Switch was announced either, though. This demo looks rather nice. It’s not mindblowing, it doesn’t use Vulkan or ASTC, and I don’t know what generation of Rogue architecture it’s running on or any specs of the device, but it’s running at 1080p 60fps and looks pretty solid.

                    The reason nobody makes a PowerVR graphics card probably has to do with the fact that video card manufacturers don’t manufacture the chips; Nvidia and AMD do, and MSI or ASUS just purchase them. In order to make a PowerVR desktop GPU, a company would have to be willing to lay out and create a custom chip. As for Adrenos, the current ones are nothing like the Imageons that Qualcomm acquired from AMD all that time ago.

                    As for why Nintendo went with Nvidia, I’m assuming you missed this report about Nvidia being desperate to have a presence in the console market and possibly offering Nintendo a deal so good that Nvidia might be taking a loss.

                    http://wccftech.com/nintendo-nx-handheld-nvidia-tegrabased-soc-rumor/

                    Regardless of what you think about their performance, I’m sure we can at least agree that going with an IMR without eDRAM was a poor choice for the Switch, right? I mean, there’s no reason the Switch’s chip (let’s call it the TS1) shouldn’t be able to run BotW at 1080p if it can run it at 720p just fine. When docked, its clock speed goes up 2.5 times, meaning the ALUs, TMUs, and ROPs are all being overclocked the same amount. Everything is scaled more than the delta between 720p and 1080p. The only thing that can possibly be holding it back is the RAM, since it scales very little or not at all.
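                    As a quick sanity check of that scaling argument, here’s the arithmetic in Python (the 307.2/768 MHz clocks are the widely reported Switch figures, used as assumptions; they don’t come from this article):

                      portable_mhz, docked_mhz = 307.2, 768.0     # reported portable vs docked GPU clocks (assumption)
                      clock_scale = docked_mhz / portable_mhz     # 2.5x more compute when docked
                      pixel_scale = (1920 * 1080) / (1280 * 720)  # 2.25x more pixels at 1080p vs 720p
                      print(clock_scale, pixel_scale)             # 2.5 > 2.25: the shader math scales past the resolution delta
                      # Memory bandwidth barely changes between modes, which is the bottleneck claimed above.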

                    1. But that one is untrustworthy too. Even if, on the whole, Nvidia is better than the A9X, it’s still untrustworthy because these are two technologies on very different systems and, like any software, expected to be optimized for Apple. Also, this one doesn’t consider the terrible driver overhead caused by Android.
                      In reality, with a system optimized for performance, like Apple’s or Nintendo’s, the A9X would fare even worse than this.
                      Run a card on Linux and on Windows and they will give you far different results. Android is killing performance and you should know that. And since Nvidia is still better, as you can see from that bench, you can imagine what it would be on a system optimized for it.

                      PowerVR has enough money to make that custom chip for a discrete graphics card if they have that superior technology. They haven’t, simple as that.

                      Choice of RAM is based on cost; Nintendo is very attentive to cost. eDRAM is faster, that’s it, but far more costly. Though no one knows how large the caches are, etc., so maybe the ‘custom’ SoC is tuned in those areas, who knows. Anyway, the Switch should be somewhat cheap because it’s destined to have at least one price cut, and with a new version maybe two.
                      The point is, do you like what they have proposed in terms of games? Do you like the graphics in Super Mario Odyssey, Zelda and Skyrim? If yes, the graphics performance is enough; otherwise they should have done better. I like it, sincerely; I’m satisfied.

                      P.S.: Many developers do say that optimizing engines for eDRAM is more difficult than just putting everything in the same chunk of memory; we heard that when the old consoles came out, remember? I would go every time for the best performance, but both solutions always have explanations (cost, etc.). eDRAM is cool, I like it, but they still have to appease everyone (developers, managers, etc.).

                      1. “But that one is untrustworthy too. Even if, on the whole, Nvidia is better than the A9X, it’s still untrustworthy because these are two technologies on very different systems and, like any software, expected to be optimized for Apple.”

                        3DMark is notably not optimized for Apple. It’s meant to be representative of actual games, and they’ve known for a while that their physics middleware uses a data structure that runs poorly on Apple CPUs, as well as a data dependency that Apple’s CPUs don’t deal with very well. They even came up with a way to partially fix that, but they decided not to upstream it into the benchmark because then it wouldn’t be representative of that middleware anymore. I’m not sure how much this affects the graphics benchmarks, if at all, but it shows that they’re not into optimizing for a specific platform.

                        As for the driver overhead, we have one measure of driver overhead, and that’s GFXBench’s driver overhead test, which creates an unrealistically high number of simple draw calls and just as many state changes. The iPad Pro’s result is 3x to 3.46x higher than the Shield TV’s, which is ridiculous, but that does not mean the high-level tests receive that much of a benefit. The high-level tests do a smaller number of complex draw calls. Like I said before, we can tell this by changing from OGL to Metal. This causes the driver test scores to increase 1.7x in one test and 9x in the other. If those high-level tests were draw-call heavy, then it would be difficult not to see an enormous framerate improvement when switching to Metal, but we don’t. Instead, we see a performance decrease. To sort of rephrase my argument: if a 9x increase in the driver test doesn’t vastly increase performance in these tests on the same phone and OS, then why would a 3-4x difference between phones make the results incomparable?
                        You can even see that they use relatively few different meshes in the benchmarks. Manhattan has duplicate tanks, cabs, and helicopters in each scene, with mainly post-processing, resolution, and alpha effects increasing in 3.1 and 3.1.1. Don’t get me wrong, though. I’m not saying that APIs and driver overhead have no effect on things. They just don’t appear to in GFXBench. 3DMark is a different story.
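                        To make that rephrased argument concrete, here’s a toy model in Python (the millisecond figures are mine and purely illustrative; only the idea that a faster API speeds up just the draw-call slice of the frame comes from the post above):

                          def fps(draw_call_ms, gpu_ms, api_speedup=1.0):
                              # Only the driver/draw-call slice of the frame benefits from a faster API.
                              return 1000.0 / (draw_call_ms / api_speedup + gpu_ms)

                          # If draw calls are a thin slice of the frame, even a 9x-faster API barely helps:
                          print(fps(1.0, 16.0))                  # ~58.8 fps baseline
                          print(fps(1.0, 16.0, api_speedup=9))   # ~62.1 fps, a tiny gain
                          # So a big driver-overhead gap needn't make whole-scene results incomparable.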

                        Still, I guess we can’t really figure out how much driver optimization factors into things until a phone gets released with the Helio X30, but I am curious how much performance you really think is being robbed from the TX1.

                        Besides, there’s one thing you don’t seem to be considering enough. We can see with the Shield TV and Pixel C that without a heatsink and fan constantly blowing hot air off of the X1, the performance dips by 23-29% and 76% (from the Shield TV’s scores; 50% from the Pixel C’s own results) in the long term. The A9X doesn’t have the benefit of active cooling, though it’s not immediately apparent how much that’s needed because it only dips 2.5% in the long term. At the same time, you can make the argument that the A9X is hooked up to RAM with a 128-bit bus, so that’s helping its performance, but that would also increase heat output and power usage. That would have been a factor in the A8X comparison as well, except both the TX1 and A8X had the same bandwidth. But with the TX1 using LPDDR4, it was able to use lower-power memory and a smaller memory controller. The iPhone 7 or the iPad Pro 9.7″ would be more comparable in that respect, since they both have the same bandwidth as the TX1. One is a 257/515 GFLOPS part and the other is 346/691 GFLOPS, but both out-perform or are on par with the Pixel C or Shield TV despite being passively cooled with less ventilation.

                        Yes, PowerVR does have enough to make a custom chip, and they do for demonstration purposes, but it seems they choose to remain an IP vendor. The article I linked you to was the review of their last desktop GPU, and that was critically praised for competing with graphics cards twice its price, but there was no successor because STMicroelectronics actually made the chips and decided to close their graphics division. Perhaps the cards didn’t sell well?

                        I’m aware that eDRAM would have added to the chip’s cost and that developers had trouble taking full advantage of the eSRAM on the XBO, but I think that was mainly because of its size. 32MB of eDRAM can only fit a 128-bit 1080p framebuffer with no anti-aliasing, which is probably why 900p became such a popular option. Larger g-buffers would still be an issue, though. However, if Nintendo wanted to make a system that scales between two modes, they should have considered the scalability of bandwidth. That’s why I think a tile-based GPU would have been ideal. You get most of the g-buffer bandwidth isolation of a large on-chip memory pool while only needing enough memory on-chip for a few tiles, which would require only 32KB per tile for PVR and 2.8MB per tile for Adreno for a 256-bit g-buffer. The bandwidth of that tile buffer would also scale with the rest of the chip, and simple games could very easily scale up to 4K without having to worry about overflowing the on-chip memory. It shouldn’t be much of a problem for development, since the actual tiling is all hardware-managed. There are things a dev can do to further exploit the tile buffer, but they can otherwise design the game as if there is only one large pool of memory. In PVR’s case, they also do hardware HSR, which is quicker than doing an early z-pass on an IMR, and I don’t believe it halves the polygon throughput the way an early z-pass does on IMRs either.
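                        For what it’s worth, the 32MB figure checks out. A back-of-the-envelope in Python (my arithmetic; the 32x32 tile size is picked purely for illustration, since actual tile sizes vary by vendor):

                          width, height = 1920, 1080
                          bytes_per_pixel = 128 // 8                       # a 128-bit framebuffer is 16 bytes per pixel
                          print(width * height * bytes_per_pixel / 2**20)  # ~31.6 MiB: just fits in 32MB, no room for MSAA

                          tile = 32                                        # hypothetical 32x32-pixel tile
                          print(tile * tile * (256 // 8) / 2**10)          # 32.0 KiB on-chip per tile for a 256-bit g-buffer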

                        Graphically, no, I don’t really like the graphics of Mario Odyssey. The stuff in the more fantastical worlds looks great, but the city looks terrible. The lack of resolution and anti-aliasing, mixed with the sharp, contrasty highlights on the square building ledges and traffic light poles, makes the image look really dirty. Usually a lack of anti-aliasing won’t show through YouTube videos, but it stuck out like a sore thumb in Mario Odyssey. And without grass covering up the ground, the really low-resolution shadows stand out more. Keep in mind that these little artifacts dance around when in motion.

                        http://nichegamer.com/wp-content/uploads/2017/01/super-mario-odyssey-10-13-17-3.jpg

                        These images of Dragon Quest Heroes don’t look great either.

                        PS4
                        http://nextn-cdn.nextn.netdna-cdn.com/wp-content/uploads/2017/01/1701-18-Dragon-Quest-Heroes-II-Nintendo-Switch-PS4-01-600×337.jpg

                        Switch
                        http://img2.meristation.com/files/imagecache/inline/imagenes/general/comparativa_switch.jpg?1484818885

                        I’m not seeing anything in any of those pictures that exceeds the demos I showed you on a technical level. On an artistic level they absolutely look better, but the Dwarf Hall demo is pushing 60 million polygons a second at 4K on Android using OpenGL ES 3.0 without ASTC. It too would look quite a bit better with Vulkan, ASTC, and a half-way decent art team. And I would argue that the library demo has texture quality and polygon counts similar to Mario Odyssey, just way more boring. It also has anisotropic filtering, which Odyssey clearly doesn’t have, considering the streets turn into a very flat grey very quickly.

                        Honestly, the Switch game that is the most visually impressive to me is Xenoblade Chronicles 2, but that looks like it’s having trouble keeping a stable frame rate, and the video uploaded is 720p, which I wouldn’t think much about if it weren’t supposedly coming out this year. I have a weird feeling that it’s going to remain 720p when docked and just use the extra performance to maintain a stable frame rate and maybe increase draw distance.

                        I’ve heard some people claim that ARMS looks like a PS4 game, but it seems like some of the crowd may be sprites, and the ones that are 3D are low frame-rate.

                        1. I say that what I have seen of Skyrim for the Switch isn’t what I’ve seen from those demos, which look flat in comparison. Very tight spaces, not even complex, and without any physics involved. Skyrim is simply incredible to think it’s working on a ‘tablet’. Odyssey has its pluses in reality; I don’t think it’s technically cheap. Anisotropic filtering isn’t that demanding nowadays.
                          And I reiterate, benchmarks across different platforms are stupid, and Android sucks at performance, you should know that. It’s not a fair comparison even if Tegra has the edge (because Android sucks in that department).

              2. Chips doing what exactly? Running a tech demo for 2 secs is not the same as playing BotW for 3 hrs (battery) or playing it till infinity (connected to power).
                The phones can barely run anything without heating up… I’m currently using my iPad to type this and it’s already hot because of a few minutes playing Juggernaut Wars! So even if the chips can theoretically do better, it doesn’t mean they can do better in a real-world scenario. The most powerful mobile phone cannot play BotW without melting… so stop with the mobile chips talk…

              3. They had to keep it under $300! Sacrifices had to be made. A lot of sacrifices! Limited accessories, no bundled game, reduced drive space, older chip, but by God they kept it under $300.

                If you want to spend a couple hundred more on two games and another controller, well at least you got the system for under $300. (Because you’re not playing ARMS without another set of $70 Joycons you know)

                You’ve gotta look at the big picture!!

                1. I’ve said this in previous posts. The iPhone 7 has a chip with a better GPU than the Switch (one that’s actually better suited to what the Switch does, because its memory bandwidth scales up with everything else) and it costs Apple $27 to manufacture.

                  The Shield TV runs a Tegra X1 at a higher clock speed than the Switch and comes with a game controller and a TV remote. Both have touch pads and microphones, and the TV remote has motion controls. It sold for $200 in 2015, meaning its cost to Nvidia is probably about $150 or lower. The chip used in the Switch will likely be cheaper to manufacture than the TX1, since they can remove unnecessary components from the SoC, making it smaller and cheaper.

                  Even if you factor in the added battery, LCD screen, and dock, the Switch would still cost Nintendo less than $200 to manufacture. 

                  For reference, here’s the bill of materials for the iPhone 7 and the Samsung Galaxy S7

                  https://9to5mac.files.wordpress.com/2016/09/cszmryovyaawd8n.jpg?quality=82&strip=all&w=702&strip=all

                  The iPhone 7 costs $220. If you get rid of the cellular modem SOC and the cameras, since the Switch doesn’t have those, you have a $166 device.

                  http://img.clubic.com/08381720-photo-ihs-galaxy-s7.jpg

                  The Galaxy S7 costs $250. We can’t factor out the cellular modem because it’s built into the SOC but you can get rid of the cameras which brings it down to $235 and that’s including a Super AMOLED display with 4 times the resolution of the Switch’s screen and faster memory.

                  Both of these phones also mount the memory chips onto the main SOC, which costs more, and both have things like microphones, barometric pressure sensors, a compass, a fingerprint reader, an NFC controller, and a gyroscope, which the Switch does not have.

                  As for the Joy-Cons, they really would just have a gyroscope, accelerometer, Bluetooth chip, and some vibration motors, with everything controlled by a small microcontroller. Here is a bill of materials for the much more complicated Apple Watch for comparison.

                  http://i-cdn.phonearena.com/images/articles/185210-image/38mm-Apple-Watch-Sport-Bill-of-Materials.jpg

                  That’s $81. Let’s remove the OLED screen, memory, user interface, box contents, and power management (just not that complex), and let’s halve the electromechanical stuff. That’s $25, and we still haven’t gotten rid of the pulse sensor and ambient light sensor, and the wireless chip includes Wi-Fi. If we assume each of these components is $1, then we’re at roughly $22 per Joy-Con.

                  If they wanted to keep the price of the Switch low, they would have just sold it at $250.
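                  Putting those subtractions in one place (all the dollar figures are the ones cited above; this just redoes the arithmetic):

                    iphone7_bom, iphone7_switch_like = 220, 166      # full BOM vs after dropping the modem SoC + cameras
                    galaxy_s7_bom, galaxy_s7_switch_like = 250, 235  # full BOM vs after dropping the cameras
                    print(iphone7_bom - iphone7_switch_like)         # $54 implied for the modem and cameras
                    print(galaxy_s7_bom - galaxy_s7_switch_like)     # $15 implied for the S7's cameras

                    watch_stripped = 25                              # Apple Watch's $81 BOM minus screen, memory, UI, etc.
                    joycon = watch_stripped - 3                      # minus ~$1 each for pulse sensor, light sensor, Wi-Fi share
                    print(joycon, 2 * joycon)                        # ~$22 per Joy-Con, ~$44 for the pair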

                  1. I’m pretty sure Nintendo priced it so they could cut the price within a year, and still make money. It’s $300 to recoup R&D costs most likely.

                    Just another reason I’m waiting.

                    1. They could have recouped that off of game sales, though. The higher price of the system is just going to make it sell less. As we’ve seen from reports, 60% of Japanese gamers think it’s too expensive.

                      1. It’s going to have a great launch… all Nintendo systems do. How well it sells in holiday 2017, that will tell us something.

            2. It can’t really be behind anything, because it doesn’t compare to any other device. You could say it’s behind one of the other home consoles, except they don’t feature portability. You can’t compare it to phones either, because phones aren’t gaming-dedicated devices. Therefore, Nintendo has not just the best, but the only product of the kind. Whether it’s a “good” product is a different matter, of course.

              1. I’d like you to meet the Shield Tablet: an Android gaming tablet from 2014 featuring a Tegra SOC, a microSD card slot, a 1920x1200px 8″ LCD screen, HDMI out, two 5MP HDR cameras, 2.4GHz and 5GHz WiFi, stylus support including palm rejection, and a battery with 24% more capacity than the one used in the Nintendo Switch. The Switch does have a better SOC than the Shield Tablet, but the Shield might be more powerful than the Switch’s handheld mode. Still, the Switch isn’t the only device of its kind. It’s just the only tablet that BotW is coming out for.

                https://www.amazon.com/NVIDIA-SHIELD-K1-Tablet-Black/dp/B0171BS9CG/ref=sr_1_2?ie=UTF8&qid=1486485598&sr=8-2&keywords=Shield+Tablet

        2. If it gets traditionally portable games, it will bridge. If it doesn’t, then it won’t.

          I’m mostly talking about the 3DS RPGs we’ve gotten used to; the ones from Square (Bravely Default), Fire Emblem (I’m pretty sure Switch IS getting a new Fire Emblem but I don’t know for sure because my internet cut out during that part of the Switch presentation and I don’t care about Fire Emblem), and above all, POKEMON.

          Really though, the Switch will never replace the 3DS unless Nintendo ALLOWS it to replace the 3DS by killing the 3DS. Personally I hope they don’t do that, because I’m not buying a Switch and I have a 3DS, but honestly that’s what they have to do.

          1. 3DS is already like 6 years old. It’s time.

            Square Enix is already on board, as is Atlus, and Fire Emblem is already confirmed. Pokémon seems inevitable.

            1. It’s time to replace it with cheaper technology that would make it a nice little $149 handheld. But if they want to maintain the dual screen, it will keep its $199 price tag.
              They still need a market for the cheap (cheap hardware, cheap games).

              1. Not every game on the Switch is $60. They can and are releasing cheaper games for it and eventually the console itself will be cheap enough. Even at $300 though, if people understand and fall in love with the concept, it might not even matter as much as you think it would.

                1. Let’s hope for it, let’s hope for it. I would like a ‘just a bit better than 3DS quality’ game at $20. I hope it won’t just be downloadable; that depends on Nintendo. But Fast RMX is going to be download-only, and it isn’t even a little project. Uhm.

          1. That was a mistranslation of a generic statement about future hardware. There’s an article on NintendoLife that I can’t link here for whatever reason.

                1. Great way to look at it. The Switch is the first test of the all-in-one system, and whether it translates to sales will determine if Nintendo keeps only a single flagship console with no secondary dedicated portable. If that happens, the advancement in tech should allow for a considerable upgrade and warrant a higher cost. That now leads to the question of lifespan: even if it’s popular, Sony and Microsoft will come out with full new systems when, 2022? They have “partial” upgrades coming or already released, right? Scorpio and something else. PS4 4K and VR will keep those companies stable for a minute, so things will align for them to offer new consoles before Nintendo might want to. But if the Switch is popular as its own unique experience, maybe Microsoft and Sony will even “give up” the portable market to Nintendo. Sure, that might only be partially how they see it, as Sony and Microsoft have their hands in mobile as well, with both hardware and software (phones, tablets, etc). So in this way Nintendo might have their niche carved out. If it is as good as we hope.

                  1. *Or Sony and Microsoft could release new consoles by 2020, and if the Switch tanks, then Nintendo will likely make the dedicated portable successor to the 3DS, because they can corner that market without spending as much on making it anywhere near as powerful as the other home consoles, and without worrying about releasing it too quickly. If not, who knows.

                2. The Wii U got killed by itself; the N3DS is innocent!
                  Bad GamePad, bad name, bad marketing, bad third-party support, and Nintendo wasn’t ready to thicken their first-party teams, so they released few games. They also stopped making games and ported them directly to the Switch, since the Wii U was a dead horse. It was just bad in the whole scheme.
                  The Wii co-existed wonderfully with the NDS, wonderfully… and it was a big success.

                  Please stop thinking that by killing handheld gaming the Switch will see roses. Nintendo doesn’t want to give you more than 1 game per month, because in their (faulty) mind people buy 1 game per month, and they don’t want internal competition from their own games, especially if those are high-budget games.
                  If they kill the Nintendo 3DS they will have LESS resources for your games, because they GAIN money from N3DS hardware and games, and thus the liquidity to attend to and expand into other markets. Actually, the N3DS kept Nintendo afloat and able to offer you the Nintendo Switch; they certainly didn’t find the resources in their dead horse.

          2. I find a 3DS successor somewhat necessary. I don’t want dual-screen gaming to die; Nintendo has done such a great job with it. Also, I don’t want all my games to be $60. $40 is a great price for a more budget title. I don’t want to pay $60 for Super Mario 3D Land or The Legend of Zelda: A Link Between Worlds; $60 is more for Breath of the Wild or Odyssey.

            1. My feelings are somewhat mixed on this. The dual screens were put to great use when the DS was new, but it seems like the touch screen capabilities are being used less and less anymore, and I hardly pay attention to both screens as much as I used to with older games. It must be me taking it for granted or something.

        3. King Kalas X3 {Greatness Awaits at Sony PlayStation 4! Hopefully it will also await us at Nintendo Switch if Nintendo doesn't FUCK things up again!}

          If Switch is highly successful, this interview right here gives one the impression they WILL use the Switch to replace not just Wii U but 3DS, too. The development of a handheld successor to 3DS is most likely on a timed delay if so.

          1. I’d say it’s nonexistent, which is good. It would only cannibalize the Switch. All Nintendo needs to do is release Pokémon.

          2. I always thought the 3DS was going to be replaced by the Switch. It’s been around for a good few years now; they just have to see how successful the Switch is before transitioning, just like what they did with the DS.

          1. A Nintendo Switch Pocket wouldn’t be too bad actually. It’ll play the same games as the regular version so no chance of cannibalizing sales.

              1. My thought would be something like how the 2DS is a cheap model without 3D: a Switch Pocket would be a cheap portable-only model.

        4. I’m calling it now… Yoshiaki Koizumi will and should be the next president/CEO of Nintendo. He has characteristics similar to Satoru Iwata’s, and he seems like he would be the best option to lead Nintendo in the future. Nintendo would be wise to make it so.

        5. This is good to hear. The main benefit of the Switch (from a dedicated Nintendo gamer’s standpoint) is that development will be unified, meaning a steady flow of 1st party content. That inevitably means that the 3DS goes.

          Nintendo will be reluctant to kill the golden goose until they are certain that Switch will carry the load, but you can be certain that the 3DS will be strictly family budget titles if the Switch surpasses the Wii U’s 13M by March 2018.


        7. My only beef with the Switch replacing the 3DS is the form factor. The clamshell design with dual screens is genius and superior in terms of portability, durability, and ease of use. It’s just so nice, convenient, and easy to whip out the 3DS for a quick session and then just close the lid when you need to end your game, when the wife starts nagging or whatever else interrupts your session. Then open up the lid and resume exactly where you left off, without having to pause, or enter a menu and shut down the screen for battery conservation. I do this multiple times in a span of a few minutes sometimes. You just can’t beat that level of convenience with a single open screen. Hands down, the 3DS design makes for a superior portable.

