Otherwise that one company with the fruit logo might overtake the big three. Unfazed after adding $1 billion to Microsoft's development costs, Epic Games VP Mark Rein, whose company develops Unreal Engine, is once again pushing console manufacturers for “bleeding edge” technology:
“There’s no end in sight for what we can do with unlimited technology. So we’re always going to be pushing and I’m sure we’ll be pushing for more than is possible to give. But yes, we feel that’s kind of our duty. That’s what Epic is here for. This is why we did Samaritan and why we’re doing a really high-end demo in the room here. We really are pushing these guys, because if they don’t, Apple will go right past them.”
The Samaritan real-time tech demo he mentions was allegedly running on three overclocked GTX 580s when it was showcased last year, a setup far more powerful than even most PC gaming rigs found today. Thankfully, Epic seems to have optimized the technology since then, judging by a closed-door showcase of Unreal Engine 4, as $1,500 for graphics processors alone would cause most wallets to disintegrate.
Should Microsoft and Sony (Nintendo seems set already with the Wii U) cave in to Epic's pressure, we may yet again see loss leaders showing up in the eighth generation of consoles. While this would be an exciting development for hardware enthusiasts, our bank accounts, on the other hand, will not be amused.
As for the threat from Apple… they don’t even have buttons!
Source: CVG
and i want a yacht full of beautiful women
Developers should be pushing Sony, Nintendo, and Microsoft for better hardware. After all, the final hardware specifications determine what developers are going to be stuck with for quite some time. The consoles may be expensive initially, but I think it's a fair trade-off when you consider the current generation. Whatever the console manufacturers decide, I just hope they give developers ample memory and GPU horsepower to utilize.
I actually disagree with this, since the standard resolution for TVs is going to be 1920×1080 or so for the next 5+ years at least (2K and 4K TVs aren't even released yet…)
I believe the current generation of graphics cards should be able to handle most games at 1080p at 30 fps pretty damn easily (a GTX 560 or HD 6870 or better)
as long as the next generation has a CPU/GPU combo that can handle 1080p at high settings now, it should be fine for most games…
my example isn't exactly the best, but how the hell did my Go 7900 GS run most games at 30+ fps on medium five years ago, and it can still run most games at 30 fps on medium now… and my friend's 8800 GT and 4850 are pretty much the same way, but at higher resolutions and settings. (8800 GT with a Q6600 running BF3 at 30+ fps at 1920×1080 on medium settings lol)
so pretty much i concluded that if you can run a recent game at a certain resolution, framerate, and setting at the time of a GPU's launch, you'll most likely run future games at about the same settings for 4-5 years…
I'm with xmachine, since developers often miss their target resolution or make compromises to meet that goal. This isn't certain, as developers may want to push graphics even further as they become more familiar with the architecture. So, IMO, having powerful hardware aids in that initial push and helps developers, especially if they want to push the system. Furthermore, people want to see a huge leap in overall graphical fidelity, not just the resolution. A more powerful console will also benefit us PC gamers, as less will be compromised in the porting process, since what gets ported would be of higher quality from the get-go.
I realize most cards today can run games maxed, but that's because most of them are ports and aren't really going “out there” like Crysis once did. Sure, you have your BF3 and Metro 2033, but in some regards there are still compromises for the lower end. I don't think these games are 100% pushed toward the higher end. A higher-end console will help set the baseline for what is considered standard.
However, I am not saying you are wrong; in reality it may not be practical or cost-effective for them to implement higher-end hardware, so they may end up going with something like you suggested. This is just something I would like to see happen for developers, from my point of view.
I'm all for the improvement of graphics processing for consoles, but I want to know what it will cost consumers. I can see Microsoft and Sony going for it. Nintendo, on the other hand, as much as I'd like to see them be a graphical contender, I don't see doing this. Rather, they will probably stay with their current Wii U build.
As the PS3 demonstrated with its shitty-ass launch due to high prices (and, to a certain extent, a lack of games) at $500/$600, around $400-500 is the cap on console costs.
and for $400… current/next-gen hardware will probably boil down to something like $200 retail worth of processor + 4 GB of RAM + $200 worth of GPU, and they might be taking a loss initially given all the design costs for a custom board and so on…
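to make that back-of-the-envelope math concrete, here's a tiny sketch — all component prices besides the $200 CPU/$200 GPU figures above are my own illustrative guesses, not real costs:

```python
# Hypothetical bill-of-materials estimate for the ~$400 console scenario.
# Only the $200 CPU and $200 GPU figures come from the comment above;
# the RAM and board/case numbers are illustrative assumptions.
parts = {
    "cpu": 200,            # ~$200 retail-equivalent processor
    "ram_4gb": 40,         # 4 GB of RAM (assumed price)
    "gpu": 200,            # ~$200 retail-equivalent GPU
    "board_and_case": 60,  # custom board, casing, assembly (assumed)
}

bom_cost = sum(parts.values())
retail_price = 400

# A negative margin here is exactly the "loss leader" scenario:
# the console sells below what the components cost.
margin = retail_price - bom_cost
print(f"BOM: ${bom_cost}, margin at ${retail_price}: ${margin}")
```

under these assumptions the parts alone total $500, so every $400 unit sold eats a $100 loss before design costs are even counted.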
If they go with a current-gen GPU, that will still be a good step (pricing may be around what you said, and an initial loss may happen again). But I believe the main problem with the initial release of the PS3 was the steep developer learning curve. A lot of developers, including Valve, weren't quite fond of the hardware choices Sony had made.
Prices for new consoles (Nintendo being the exception) will probably be high. Maybe not as high as previous generations, but if they go with newer tech (either current or unreleased/custom hardware), I still foresee a higher initial cost for early adopters. Prices will probably settle after a year or two, but whenever a product like this comes out and demand is high, people will even go on eBay and start shelling out a lot of money (some units went for upwards of $800-1,000).
people shelling out $1,000 for new consoles/tech is common for every tech product… especially popular ones that have an artificially created shortage :/
I don't see why the companies can't literally just buy licenses for current hardware and build a custom board/casing to fit it all in… it's got to be cheaper than hiring a firm to design your own custom hardware… unless you're trying to do what Microsoft is doing: putting a medium/high-end CPU and a medium-end GPU onto one die…