AMD details Threadripper 1920X and 1950X CPUs
By Thom Holwerda on 2017-07-13 20:57:58

Last night, out of the blue, we received an email from AMD, sharing some of the specifications for the forthcoming Ryzen Threadripper CPUs to be announced today. Up until this point, we knew a few things - Threadripper would consist of two Zeppelin dies featuring AMD's latest Zen core and microarchitecture, and would essentially double up on the HEDT Ryzen launch. Double dies means double pretty much everything: Threadripper would support up to 16 cores, up to 32 MB of L3 cache, and quad-channel memory, and would require a new socket/motherboard platform called X399, sporting a massive socket with 4094 pins (also marking a move to an LGA socket for AMD). By virtue of having sixteen cores, AMD is seemingly carving a new consumer category above HEDT/High-End Desktop, which we've coined the 'Super High-End Desktop', or SHED for short.

AMD is listing the top-of-the-line Threadripper 1950X for 999 dollars, which gives you 16 cores and 32 threads, with a base frequency of 3.4GHz (and a turbo frequency of 4.0GHz) at a TDP of 180W (nothing to sneeze at). The 1920X comes in at 799 dollars, with 12 cores and 24 threads at a 3.5GHz base frequency and the same 180W TDP. These are two quite amazing processors, and over the next year or so, the pricing should come down a bit, making them more affordable for regular computer use as well.

Well done, AMD. Sure, we need to await the benchmarks for more information, but this is looking really good. I'm hoping this will finally start forcing developers - specifically of games - to start making more and better use of multicore.

Games
By Alfman on 2017-07-13 23:05:37
Thom Holwerda,

> Well done, AMD. Sure, we need to await the benchmarks for more information, but this is looking really good. I'm hoping this will finally start forcing developers - specifically of games - to start making more and better use of multicore.

With graphics cards, developers can target very high-end graphics and low-end graphics simultaneously. The worst that happens on lower-end systems is a lower frame rate, lower resolution, or lower details. Crucially, the game is still fundamentally playable on lower graphics settings.

Because of the way they're used, CPUs are different. More CPU cores can benefit back-end systems: smarter bosses and AI companions, more sophisticated physics, more characters operating concurrently, intelligent speech, far larger crowds, etc. However, unlike with graphics cards, it may not be so easy for a studio to support cheaper CPUs without fundamentally changing the gameplay experience.

For example, a game might have AI teammates that learn from and work with you to get through game obstacles, but turning those teammates off could break the game.

Don't get me wrong, I'd be interested in seeing all the cool things they could do, but I suspect game studios will continue to target lower CPU specs so they can sell more games. IMHO 32-thread CPUs will remain very niche in the medium term, at least for consumer applications.
Score: 3
RE: Games
By dark scizor on 2017-07-14 07:50:45
Couldn't the studio hardcode a requirement for more threads to be available to enable the more sophisticated features as replacements or supersets, though?

They'll still have the R&D performed for their current title ready for their next ones.
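
Something like this can even be probed portably in standard C++; here's a minimal sketch of such a gate, with the thresholds and feature names invented purely for illustration:

    #include <cstdio>
    #include <thread>

    // Hypothetical feature gate: enable heavier AI/simulation paths only
    // when the machine exposes enough hardware threads, falling back to
    // the baseline scripted behavior otherwise. Thresholds and feature
    // names are illustrative, not from any real engine.
    int main() {
        unsigned hw = std::thread::hardware_concurrency(); // may be 0 if unknown
        if (hw == 0) hw = 4; // conservative fallback

        bool learningTeammates = hw >= 16; // superset feature
        bool crowdSimulation   = hw >= 8;  // replaces static crowds

        std::printf("threads=%u learningTeammates=%d crowdSimulation=%d\n",
                    hw, learningTeammates, crowdSimulation);
        return 0;
    }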
Score: 1
Now only...
By shotsman on 2017-07-14 07:57:50
if we could get a Hackintosh to use them...
That would be nice.

{wishful thinking indeed}
Score: 2
RE: Games
By feamatar on 2017-07-14 08:29:52
I think it is not hardware resources that hold AI back; it is rather the low priority assigned to AI development, and that it needs lots of iterations to get it right. The great FPS AIs of the past (Half-Life, Halo, F.E.A.R.) used rather smart scripting and audiovisual cues to provide convincing and engaging encounters. Even in strategy games, smart scripting can work wonders, e.g. DarthMod for the Total War series.

Other areas, like your examples, scale rather well in my opinion, and they are all related to graphical detail: the number of agents and small entities needs heavy preparation on the CPU side before the GPU can render them, but their number does not break the game.
Score: 2
RE: Now only...
By avgalen on 2017-07-14 08:32:58
That isn't wishful thinking at all. It might take a few months for the necessary tweaks, but this will surely happen in the hobby sphere.

However, most people who spend a thousand dollars/euros on just the CPU are graphics professionals, and Hackintoshes just aren't on their radar. The other people who spend this much on just the CPU are gamers, and they are not interested in running macOS.

(Wishful thinking would be "if only Apple would sell macOS separately and allow it to work on non-Apple hardware".)
Score: 2
RE[2]: Now only...
By mistersoft on 2017-07-14 11:16:30
If a future Apple CEO made the leap again to sell macOS to all comers/PC builders, what is the (future equivalent) most they could sell it for?

Would Hackintosh builders fork out, say, $999 for an unrestricted macOS (with, e.g., built-in AMD plus Intel optimisations, but support for only a subset of GPUs and sound cards/chips)?

At anything less than a $700 unit price, I don't think the board would let it happen.
Score: 2
RE[3]: Now only...
By mistersoft on 2017-07-14 11:18:48
And I should be in the market for a new HEDT system at the end of the year; the TR 1950X vs. the i9-7900X is an enticingly difficult choice...
Score: 2
RE[4]: Now only...
By grat on 2017-07-14 12:23:46
44 PCIe lanes (Intel) vs 64 PCIe lanes (AMD).

Considering that two graphics cards (x16 * 2) plus one NVMe SSD (x4) is already 36 lanes, I think Intel may regret only having 44 lanes.

Initially, the TDP (power draw) of the AMD chips is a bit scary, but it's "only" 11.25 watts/core, whereas Intel is at 14 watts/core.
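
For what it's worth, those figures are easy to sanity-check (the 140W/10-core numbers are the published i9-7900X specs, the 180W/16-core ones the 1950X's); a trivial worked check:

    #include <cstdio>

    // Back-of-the-envelope check of the figures above: PCIe lane budget,
    // plus per-core TDP for the i9-7900X (140W, 10 cores, 44 lanes) and
    // the Threadripper 1950X (180W, 16 cores, 64 lanes).
    int main() {
        int lanesUsed = 16 * 2 + 4; // two x16 graphics cards + one x4 NVMe SSD
        std::printf("lanes used: %d (of 44 on Intel, 64 on AMD)\n", lanesUsed);
        std::printf("W/core: Intel %.2f, AMD %.2f\n", 140.0 / 10, 180.0 / 16);
        return 0;
    }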
Score: 2
RE[2]: Games
By Alfman on 2017-07-14 17:32:01
feamatar,

> I think it is not hardware resources that hold AI back; it is rather the low priority assigned to AI development, and that it needs lots of iterations to get it right. The great FPS AIs of the past (Half-Life, Halo, F.E.A.R.) used rather smart scripting and audiovisual cues to provide convincing and engaging encounters. Even in strategy games, smart scripting can work wonders, e.g. DarthMod for the Total War series.

I was thinking of something far more sophisticated than those "script" examples to be honest, and with far more actors in play. The games we see today are rather limited due to the lack of CPU power.

I concede games can be fun without more sophisticated AI, but having enemies and artifacts that react to their environments with preprogrammed scripts (and animations) is extremely limiting too. In the real world there are potentially an infinite number of solutions, and everyone's path is naturally different. But in video games, rather than forcing players to follow highly scripted actions, we could use lots of CPU cores to calculate what is physically possible. We've all encountered obstacles in games where something should work, but doesn't, because the programmers didn't anticipate it (or didn't have time to write scripts for it); more cores running physical simulations could fix that. Most game environments are still quite rigid for the same reason.
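
To illustrate the kind of scaling I mean, here is a minimal sketch (with a made-up Actor type and update rule standing in for real physics) of spreading a large pool of simulated actors across every available core:

    #include <algorithm>
    #include <cstddef>
    #include <functional>
    #include <thread>
    #include <vector>

    // Made-up actor type; a real engine would run physics/AI logic here.
    struct Actor { float x = 0.0f, v = 1.0f; };

    // Advance one slice of the actor pool by one time step.
    void updateRange(std::vector<Actor>& actors, size_t begin, size_t end,
                     float dt) {
        for (size_t i = begin; i < end; ++i)
            actors[i].x += actors[i].v * dt; // stand-in for real simulation
    }

    int main() {
        std::vector<Actor> actors(100000);
        unsigned workers = std::max(1u, std::thread::hardware_concurrency());
        size_t chunk = actors.size() / workers + 1;

        std::vector<std::thread> pool;
        for (unsigned w = 0; w < workers; ++w) {
            size_t b = w * chunk;
            size_t e = std::min(actors.size(), b + chunk);
            if (b >= e) break;
            pool.emplace_back(updateRange, std::ref(actors), b, e, 0.016f);
        }
        for (auto& t : pool) t.join();
        return 0;
    }

The point isn't this particular loop; it's that work of this shape scales almost linearly with core count, which is exactly what a 16-core part rewards.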

Most AI opponents are quite predictable because they're following their scripts, but that's not very realistic. Real humans can develop plans; in-game characters should also be able to use their environments in intelligent ways that haven't been scripted. The same intelligence should apply to in-game AI teammates. IMHO AI could really revolutionize gaming.

With enough processing power in the long term, AI could actually automate game design itself: world design, plot twists, music, voice acting, etc. That'll be close to the day machines take over in real life, haha.

Edited 2017-07-14 17:40 UTC
Score: 2
RE[3]: Games
By feamatar on 2017-07-14 20:52:55
Scripting in AI programming does not mean that you script the scene (think about Call of Duty, or the daily patterns in open-world RPGs); instead, you script your AI system (this is easy to understand for strategy games, but the same happens with FPS AI). That is: you define behaviors and priorities that a particular type of AI should follow - this AI is reckless, that one prefers close combat, that other one teams up with others. This is all scripting.
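
(A toy example of what "scripting the system" can look like, with all names invented: an archetype is just a prioritized list of behaviors, and the engine picks the highest-priority one - preconditions omitted here for brevity.)

    #include <algorithm>
    #include <cstdio>
    #include <string>
    #include <vector>

    // Invented illustration: an archetype is a list of behaviors with
    // priorities; the engine runs the highest-priority behavior (a real
    // system would also check each behavior's preconditions).
    struct Behavior { std::string name; int priority; };
    struct Archetype { std::string name; std::vector<Behavior> behaviors; };

    int main() {
        Archetype reckless{"reckless",
                           {{"charge_player", 3}, {"take_cover", 1},
                            {"regroup", 2}}};
        auto pick = std::max_element(
            reckless.behaviors.begin(), reckless.behaviors.end(),
            [](const Behavior& a, const Behavior& b) {
                return a.priority < b.priority;
            });
        std::printf("%s AI picks: %s\n", reckless.name.c_str(),
                    pick->name.c_str());
        return 0;
    }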

But what counts as a good AI system and a good scripting environment is hard to define. For example, a strong AI is not necessarily a good AI, because an AI should be able to fail, and fail in a human way, and that is hard to prepare for.

If the AI is too efficient, too good, the player will say it is cheating.
If the AI does not lose often enough, the player will say the game is too hard.
If the AI fails, it should fail the way a human does; but humans often fail in ridiculous ways, and then the AI will be blamed for being too dumb.
Or imagine that the AI in the game decides that you are the least efficient member of the team, and the game plays itself without you.

So you want something game-like, not something realistic (like almost all of our games are; imagine a racing game where you follow the racing line for 60 laps without a single overtake, or a shooting game where you guard a warehouse for 16 hours and then get killed by a single Taliban fighter from 400 meters).
Score: 3
