higher-level languages like Python/JavaScript/Haskell, which, as hardware resources increase, game programmers will increasingly use
This is a common sentiment but I’m not sure it’s true exactly, at least not for AAA games (many indie games already use Flash or other higher-level technologies). As hardware improves, game companies also keep pushing the boundaries: better graphics, more players on a server, etc. Even with 10x current hardware you’d probably still want to implement an Eve Online server in C++ rather than Haskell. What I’d guess is more likely is that C++ will be replaced by something like Rust that gives high-level conveniences while still allowing low-level control.
(Note I say all this as someone with minimal C++ experience and little desire to get more).
This is a common sentiment but I’m not sure it’s true exactly, at least not for AAA games (many indie games already use Flash or other higher-level technologies).
Just a matter of time. We’re already a long way from assembler, and indie games represent the low end, which will gradually eat the high end’s lunch.
Even with 10x current hardware you’d probably still want to implement an Eve Online server in C++ rather than Haskell.
Given Haskell’s excellent concurrency support, I’m not sure that’s true.
Just a matter of time. We’re already a long way from assembler, and indie games represent the low end, which will gradually eat the high end’s lunch.
It might be possible to move on from C++ with a “sufficiently advanced compiler” for Haskell or the like, but barring that I think game devs for performance-intensive games will still want more precise control over memory usage. I predict we’ll see widespread use of “functional C++” (eg Rust, or C++11’s functional features) before “low-level Haskell” (eg ???).
Given Haskell’s excellent concurrency support, I’m not sure that’s true.
As far as I know, Haskell’s performance isn’t considered predictable/reliable enough for games (mainly due to GC and lazy evaluation). If even multi-threaded C++ is still too slow for what you want to do (eg have 10000 people on one server in real-time), Haskell isn’t going to help.
At one point Epic was looking at Haskell, but recently they’ve actually gone the other way, abandoning embedded scripting languages in the newest version of the Unreal engine in favour of doing everything in C++ (although I think that was as much about consistency as performance).
(By the way, as a Scala dev I’d personally rather write Haskell than C++, and C++ is one of the things keeping me out of the gaming industry, but I’m not optimistic.)
barring that I think game devs for performance-intensive games will still want more precise control over memory usage
What is ‘performance-intensive’ is constantly changing. I don’t think that languages like C# or JavaScript, which sometimes get used in game development these days, have sufficiently advanced compilers, but they still get used. (Although at least in the case of Haskell, we really do have the promised ‘sufficiently advanced compiler’ in the form of GHC and all the research put into optimizing lazy pure languages; I think the estimate I saw floating around somewhere was that a modern GHC-optimized binary of an ordinary Haskell program will run something like 1000x faster than the best that could be done in the early ’90s.)
If even multi-threaded C++ is still too slow for what you want to do (eg have 10000 people on one server in real-time), Haskell isn’t going to help.
Haskell’s pure functions, green threads, and STM are great for concurrency, so I think your argument may work in the other direction.
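For concreteness, here’s a minimal sketch of that style in GHC Haskell (assuming the stm package); the names and numbers are made up for illustration. A thousand green threads bump one piece of shared state through STM transactions, with no explicit locks anywhere.

```haskell
import Control.Concurrent (forkIO)
import Control.Concurrent.MVar (newEmptyMVar, putMVar, takeMVar)
import Control.Concurrent.STM (atomically, modifyTVar', newTVarIO, readTVarIO)
import Control.Monad (forM_, replicateM_)

main :: IO ()
main = do
  score <- newTVarIO (0 :: Int)              -- shared state, no explicit lock
  dones <- mapM (const newEmptyMVar) [1 .. 1000 :: Int]
  forM_ dones $ \done -> forkIO $ do         -- 1000 lightweight (green) threads
    replicateM_ 100 $
      atomically (modifyTVar' score (+ 1))   -- each update is an atomic transaction
    putMVar done ()
  mapM_ takeMVar dones                       -- wait for every thread to finish
  readTVarIO score >>= print                 -- prints 100000
```

Whether the scheduler and GC hold up inside a real game loop is a separate question, but the concurrency model itself is very pleasant to program against.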
Although at least in the case of Haskell, we really do have the promised ‘sufficiently advanced compiler’ in the form of GHC
The “sufficiently advanced compiler” I was referring to is one that makes high-level languages as fast as hand-tuned C++ (thus eliminating the need for said hand-tuning), not just one that’s faster than it used to be. Such a thing is probably possible, but it doesn’t exist now or in the foreseeable future. Things like precise control of memory layout can make an order-of-magnitude difference to performance, or more.
Haskell’s pure functions, green threads, and STM are great for concurrency, so I think your argument may work in the other direction.
They might make it easier, but they don’t make it faster, which is currently the limiting factor for performance-intensive servers. Making it easier would certainly help; apparently the latest Battlefield game has a lot of bugs due to hard-to-diagnose threading issues in the client. But it wouldn’t be viable to write that game in Haskell due to GC and lazy eval, even if the basic performance were good enough, which it probably isn’t.
Edit: also, as far as I know it’s possible to avoid some of these issues in Haskell with careful optimization of the code (see the sketch below), but of course the more you have to do that, the less you benefit from things “just working”.
I’m not saying we’ll never see real-time, performance-intensive apps commonly written in functional languages; I guess I’m just not as optimistic about it happening soon.
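To illustrate the kind of careful optimization I mean, here is a rough sketch of the usual tricks (the Particle type and the functions are made up for illustration): strict fields, unboxed vectors, and strict folds keep data flat and stop lazy thunks from piling up, which is most of what people do to make GHC’s allocation and GC behaviour predictable.

```haskell
{-# LANGUAGE BangPatterns #-}

import qualified Data.Vector.Unboxed as U

-- Strict fields: components are forced when a Particle is built,
-- so state does not accumulate as unevaluated thunks frame after frame.
data Particle = Particle
  { px :: !Double
  , py :: !Double
  , vx :: !Double
  , vy :: !Double
  }

-- Unboxed vectors store plain doubles contiguously, with no
-- per-element pointers for the garbage collector to trace.
step :: Double -> U.Vector (Double, Double) -> U.Vector (Double, Double)
step dt = U.map (\(x, v) -> (x + v * dt, v))

-- A strict left fold (plus a bang pattern) keeps the accumulator
-- evaluated instead of building a chain of (+) thunks.
kineticEnergy :: U.Vector (Double, Double) -> Double
kineticEnergy = U.foldl' (\ !acc (_, v) -> acc + 0.5 * v * v) 0

main :: IO ()
main = print (kineticEnergy (step (1 / 60) (U.replicate 10000 (0, 1))))
```

None of this is exotic, but the point above stands: the more of it you need, the further you are from things “just working”.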
They might make it easier, but they don’t make it faster, which is currently the limiting factor for performance-intensive servers. Making it easier would certainly help; apparently the latest Battlefield game has a lot of bugs due to hard-to-diagnose threading issues in the client. But it wouldn’t be viable to write that game in Haskell due to GC and lazy eval, even if the basic performance were good enough, which it probably isn’t.
What you can write determines how fast it will run. If you don’t have green threads, but must use OS-level threads, that’s going to be a problem. If you have to be constantly locking because of mutability and can’t use STM, that’s going to be a problem. And yes, correctness does matter, so that’s a problem too.
Fast lock-free thread-safe mutable data structures (eg Java’s ConcurrentLinkedQueue) have been written in languages like Java (and apparently even C++, though I’m less familiar with those). Also, STM isn’t necessarily much better than locks in practice; a quickly Googled example: http://nbronson.github.io/scala-stm/benchmark.html, where “medium”-granularity locks were just as good performance-wise and STM’s GC pressure was higher. (I don’t know how the Haskell equivalent compares.)
I’ve worked with videoconferencing software written in Haskell. Realtime performance is certainly possible, though whether the industry will accept that is another question.
Videoconferencing uses fairly consistent processing/memory over time. The load on the garbage collector has low variance, so it can be run at regular intervals while maintaining a very high probability that the software will meet the next frame time. Games have more variable GC load, so it’s more difficult to guarantee no missed frames without reserving an unacceptably large amount of time for garbage collection.
Unfortunately perf isn’t the only roadblock here; middleware is a real problem too. Even if you write your game in Python, your AI, physics, and tree-drawing components were all written by somebody else, in C++. No matter how good your bindings are, you have to do some data conversion every time you talk to one of those libraries, or else use C++ data types in your Python game engine.
That’s not to say that soft real-time constraints and tight bounds on memory usage and so forth aren’t also hard problems, just that even if you have those things you still need the ability to carve out 12 bytes, put three 4-byte floats inside, and give it to Havok unaltered. Eventually people will port Havok and SpeedTree and such to your HLL (or write new, better ones), but it’s a chicken-and-egg problem: no one will do that until there’s a robust market of studios using the language.
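To make that last point concrete, here is roughly what the Haskell side of such a binding looks like: a 12-byte, three-float value with a hand-written Storable layout, handed to a C entry point. Note that hk_set_position is a hypothetical stand-in for a real Havok function, not an actual API.

```haskell
import Foreign (Ptr, Storable (..), alloca, castPtr)
import Foreign.C.Types (CFloat)

-- Three 4-byte floats, 12 bytes total, laid out exactly as the
-- C++ side expects.
data Vec3 = Vec3 !CFloat !CFloat !CFloat

instance Storable Vec3 where
  sizeOf _    = 12
  alignment _ = 4
  peek p = Vec3 <$> peekByteOff p 0 <*> peekByteOff p 4 <*> peekByteOff p 8
  poke p (Vec3 x y z) = do
    pokeByteOff p 0 x
    pokeByteOff p 4 y
    pokeByteOff p 8 z

-- Assumed C-side signature: void hk_set_position(const float *pos);
-- (hypothetical stand-in for a middleware call)
foreign import ccall unsafe "hk_set_position"
  c_hk_set_position :: Ptr CFloat -> IO ()

setPosition :: Vec3 -> IO ()
setPosition v = alloca $ \p -> do
  poke p v                        -- write the 12 bytes
  c_hk_set_position (castPtr p)   -- hand them to the C++ side unaltered
```

The marshalling here is cheap but it is still a copy per call; the alternative, keeping the engine’s data in C-shaped buffers the whole time, is exactly the “C++ data types in your Python game engine” problem described above.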
Speaking as someone working on AAA games: generally what happens is that the engine is written in C++ (with small performance-critical sections possibly hand-coded using intrinsics or assembly) and some portion of the game logic is written in a scripting language. Lua is quite popular.