The NES vs. its contemporary competition

Discuss technical or other issues relating to programming the Nintendo Entertainment System, Famicom, or compatible systems.


User avatar
rainwarrior
Posts: 8062
Joined: Sun Jan 22, 2012 12:03 pm
Location: Canada
Contact:

Re: The NES vs. its contemporary competition

Post by rainwarrior »

I do think it is significantly easier to port between machines that share the same CPU than between machines with different CPUs. That doesn't mean it's easy, just easier. There are tons and tons of games that got ported between the Amiga and Atari ST, for example. Even more recently, portability between the Xbox 360 and PS3 was made a lot easier because they both had PowerPC architectures.

Now, if you want to find examples where code was shared, go looking. Maybe check out Paperboy on the NES and C64 and see if you can find the same routines in each? Might be tricky to compare, since relocation of code is going to change most of the addresses, but maybe an instruction comparison on trace logs might help find similar regions of code, if they exist. Probably it'd be a lot of work to answer this question properly; would anyone care enough to find out?
tepples
Posts: 22345
Joined: Sun Sep 19, 2004 11:12 pm
Location: NE Indiana, USA (NTSC)
Contact:

Re: The NES vs. its contemporary competition

Post by tepples »

There was an Atari ST cartridge called Spectre that used ROM chips desoldered from a Macintosh Plus to run Mac OS.

But the I/O of the Genesis was so different from that of, say, the Mac that games were rarely ported between the two.
User avatar
OneCrudeDude
Posts: 275
Joined: Fri Aug 23, 2013 2:14 am

Re: The NES vs. its contemporary competition

Post by OneCrudeDude »

But didn't the PS3's awkward architecture, especially the infamous Cell processor, make multiplats suffer? They might share the same architecture, but the PS3 suffered a lot when it came to ports, probably because, at the time, the PS3 was seen as a massive failure and Sony was bleeding money (they still are, but they haven't gone anywhere). Developers might have been wary of wasting the effort to make a good port for a potentially dead console. By the same token, the Wii U also shares a similar architecture with the PS3 and 360, and it too had several bad ports; beyond those, it sees very few multiplatform titles at all, almost as if the Wii U came from a different universe that we cannot comprehend.

According to the NESDev wiki, the NES version of Puzznic uses an illegal opcode, and some believe the core engine was copied from the PC Engine version. But as Rainwarrior said, who would care enough to check? That said, has anyone on here dabbled with, say, the PC Engine or Lynx? I know someone is porting several NES games (specifically Mega Man) to the PC Engine, and I reckon that is largely because the two consoles are very similar, which makes the porting process comparatively simple.

@Tepples: Congrats on post 12345.
User avatar
rainwarrior
Posts: 8062
Joined: Sun Jan 22, 2012 12:03 pm
Location: Canada
Contact:

Re: The NES vs. its contemporary competition

Post by rainwarrior »

OneCrudeDude wrote:But didn't the PS3's awkward architecture, especially the infamous Cell processor, make multiplats suffer?
I said easier; I didn't say it automatically makes your port a perfect copy. There's still lots of work to be done. There were definitely some poorly handled port jobs on the PS3, and a lot of devs were more comfortable leading with the 360 and treating the PS3 as secondary, especially early in the PS3's life. As time went on, though, I think the situation got a lot better. Even in the early days I can't think of very many games released on both the 360 and PS3 that I thought were significantly worse on one platform. Most of the time, if a game was worse on one, it was in a very minor way, not even close to the kind of difference you'd see in, say, a Genesis vs. SNES comparison.
Sik
Posts: 1589
Joined: Thu Aug 12, 2010 3:43 am

Re: The NES vs. its contemporary competition

Post by Sik »

lidnariq wrote:By way of analogy ... the original Macintosh, the Genesis, the Neo Geo, the Amiga, the Atari ST, NeXT machines, really early Sun workstations (everything before the Sun 4), earlier Palm PDAs, and many other things, all used the 68000 or its descendants. Yet I don't think anyone would assert that it would be significantly easier to port a program from one to another solely because they use the same CPU.
Also, many arcade boards used the 68000, but in many cases it was used in such a wasteful way that the code isn't really portable at all. E.g. it wasn't uncommon for Sega's arcade machines to have two or three 68000s, but only one of them actually ran the game; each of the others handled just a single, fairly lightweight aspect of the hardware. I guess it could have been for copy protection, but honestly I suspect it was just sloppy programming (in arcades you could get away with just throwing money at the problem).
rainwarrior wrote:Now, if you want to find examples where code was shared, go looking. Maybe check out Paperboy on the NES and C64 and see if you can find the same routines in each? Might be tricky to compare, since relocation of code is going to change most of the addresses, but maybe an instruction comparison on trace logs might help find similar regions of code, if they exist. Probably it'd be a lot of work to answer this question properly; would anyone care enough to find out?
It's unlikely to happen, since back then ports were nearly always handled by telling third parties to port the game without any assistance whatsoever (not even the original binary, much less source code). Your only hope is when you know the port was handled by the same developer.
ccovell
Posts: 1041
Joined: Sun Mar 19, 2006 9:44 pm
Location: Japan
Contact:

Re: The NES vs. its contemporary competition

Post by ccovell »

OneCrudeDude wrote:has anyone on here dabbled with, say, the PC Engine or Lynx?
Not very much on the Lynx, but it is basically a tiny Amiga (or Atari 800) that happens to use a 6502: Framebuffer + killer sprite blitter + math coprocessor.

The PCE is an NES on steroids: it was designed as a direct response to all the limitations of the Famicom/NES, while still being easy for ex-NES programmers to code for. It is not hard at all to step up from 6502 programming on the NES to 65(c)02 assembly programming on the PCE.
User avatar
Bregalad
Posts: 8036
Joined: Fri Nov 12, 2004 2:49 pm
Location: Caen, France

Re: The NES vs. its contemporary competition

Post by Bregalad »

When you think about it, very little of a game program is about dealing with the hardware
In the case of the NES, that's not entirely true. I found that at least half of my game's code deals directly or indirectly with the hardware.

For example, the player AI code doesn't deal "directly" with the hardware, but you're still going to poll button presses from the NES controller, so if you port it to another system a full rewrite will still be needed. The same goes for any code that prints or displays anything on screen, etc. You won't be writing directly to $200x, but the way you prepare your buffers and handle screen alignment is still closely tied to the hardware.
Sik
Posts: 1589
Joined: Thu Aug 12, 2010 3:43 am

Re: The NES vs. its contemporary competition

Post by Sik »

The button example is not a good one though, because that one can be abstracted easily (although it's true many games that don't support remapping just use the joypad input directly).
tepples
Posts: 22345
Joined: Sun Sep 19, 2004 11:12 pm
Location: NE Indiana, USA (NTSC)
Contact:

Re: The NES vs. its contemporary competition

Post by tepples »

Bregalad wrote:For example the player AI code is not dealing "directly" with hardware but you're still going to poll button presses from the NES controller
Easy to abstract, as Sik pointed out. The code in one of my games to handle a button might look like this:

Code:

  lda cur_keys,x   ; buttons currently held on controller X
  and #KEY_B       ; isolate the B button's bit
  beq notB         ; skip ahead if B is not pressed
  ; omitted: do_something
notB:
This would be portable by changing the routine that fills cur_keys and the value of KEY_B for the new system. But if you were using asl or lsr to check bits in order by shifting them into carry, you might have to stop doing that.
but the way you prepare your buffers and handle screen alignment is still closely tied to the hardware
That, and how the screen width affects level design. A lot of NES level design relies on the screen being 16 metatiles wide by about 14 visible metatiles tall. The Super Mario games for Game Boy needed to resort to other measures, such as shrinking the graphics to allow a 20x16 metatile grid (Super Mario Land) or centering the camera in front of the player (Super Mario Land 2; Super Mario Bros. Deluxe). These are the same considerations needed when porting Master System games to Game Gear. Some jumps over long pits in Hello Kitty World for Famicom actually become blind jumps in Balloon Kid for Game Boy. Or consider The Great Giana Sisters: because the C64 pixel aspect ratio is much narrower than that of the NES, each block has to be 24x16 instead of 16x16, and the C64's larger border means fewer metatiles across the screen (320px / 24px per block = 13.3 metatiles), somewhere between the NES and the Game Boy.
User avatar
TailChao
Posts: 11
Joined: Sat Oct 20, 2012 9:44 am
Contact:

Re: The NES vs. its contemporary competition

Post by TailChao »

ccovell wrote:Nor were most of the Lynx game developers, which was part of the problem...
The breadth of those problems could fill a sizable book.

Regarding resource sharing:
One thing that really impacts resource sharing is how the platform's memory is actually set up. For example, most consoles have execute-in-place media (i.e. code and data are executed and accessed directly from the cartridge), but home computers, and actually the Lynx as well, were set up for disk-style loading.

On the PC-Engine I use a small mapper to bring the work RAM up to 136KB and accessible ROM up to 8MB. That allows for expansive and modifiable stages, large sound data, etc.

On the Lynx there is 64KB of RAM available, and 16KB of it is lost immediately to two framebuffers. More is lost if you are doing buffer feedback effects or rendering to textures. Your engine / drivers / textures all have to live in the remaining space and still have room left for stage data. While I can access the cartridge, it is extremely slow and byte by byte (it even requires seek times). But that is the only way new data can get in.

In Zaku, all resources for a stage were loaded upfront. The only new data loaded during play were music tracks (and that happened only once, on a track change). This made the stages extremely small and also linear. In what I am working on now, data are loaded during play, including new stage data, PCM streaming, and music data. This means object data (enemies, etc.) are being swapped in and out dynamically, which is a huge headache for stage design, especially when you want the game to feel fluid.

Large, dynamic stages on the Lynx are extremely difficult. Moving to the PC-Engine (or even the NES), they're significantly easier. But there are other limitations on these platforms. A variable width font renderer can be written in minutes on the Lynx because it uses framebuffers, but it is much more work on the NES or PCE because they use tiles (and have finite bandwidth for moving these tiles in and out of VRAM).

The point is, the 6502 was in so many platforms with so many different ideas about what a game requires. Yes, you can share some code. But especially nowadays, when the only development on these dinosaurs is for the sake of explicitly targeting their unique features or just for fun, there is not much point in compromising performance or game design just to save some typing.
User avatar
OneCrudeDude
Posts: 275
Joined: Fri Aug 23, 2013 2:14 am

Re: The NES vs. its contemporary competition

Post by OneCrudeDude »

If I may interject for a moment, wasn't Hello Kitty World an NES port of Balloon Kid that came out one year or so later?

And seeing how the PCE was an NES without the biggest limitations, I'm a bit saddened that it didn't do so well in the market.
tepples
Posts: 22345
Joined: Sun Sep 19, 2004 11:12 pm
Location: NE Indiana, USA (NTSC)
Contact:

Re: The NES vs. its contemporary competition

Post by tepples »

TailChao wrote:On the Lynx there is 64KB of RAM available, and 16KB of it is lost immediately to two framebuffers. More is lost if you are doing buffer feedback effects or rendering to textures. Your engine / drivers / textures all have to live in the remaining space and still have room left for stage data. While I can access the cartridge, it is extremely slow and byte by byte (it even requires seek times). But that is the only way new data can get in.
Exactly how slow is the Lynx's cart interface? Are we talking 1x CD-ROM slow, FDS slow, or C64 tape slow? Even the Game Boy Advance has seek time greater than sequential read time, yet it's still fast enough for XIP because the memory controller prefetches up to eight Thumb opcodes from ROM during idle bus cycles (mostly multiply instructions and the pipeline bubbles of load instructions).
OneCrudeDude wrote:If I may interject for a moment, wasn't Hello Kitty World an NES port of Balloon Kid that came out one year or so later?
Release dates are beside the point, which was that practical level design on one platform turns into leaps of faith on another. Or are you claiming that this particular leap of faith was intentional?
User avatar
TailChao
Posts: 11
Joined: Sat Oct 20, 2012 9:44 am
Contact:

Re: The NES vs. its contemporary competition

Post by TailChao »

tepples wrote:Exactly how slow is the Lynx's cart interface? Are we talking 1x CD-ROM slow, FDS slow, or C64 tape slow? Even the Game Boy Advance has seek time greater than sequential read time, yet it's still fast enough for XIP because the memory controller prefetches up to eight Thumb opcodes from ROM during idle bus cycles (mostly multiply instructions and the pipeline bubbles of load instructions).
You get two registers for reading / writing to the cartridge. However, all they do is either latch or drive the state of the data bus while lowering one of two strobes. The address is determined by the concatenation of a 74164 shift register and a 4040 counter, both controlled through some GPIO. The counter supplies the lower address bits (offset), while the shift register supplies the upper bits (block). That's where the seek time comes into play: if your data are aligned on a block boundary, you don't have to perform several dummy reads just to increment the counter up to the correct offset. But the shift register must still be loaded.

More detailed writeup available here courtesy of LX.NET.

Performing a 2MB checksum takes about two minutes, which works out to roughly 16KB/s. So yes, really slow and really inconvenient.
The Lynx was originally designed to use tapes as its game media, and this scheme was the band-aid that let it use traditional cartridges instead.

Putting a microcontroller in the cartridge to allow address selection by just writing three bytes (after the boot phase anyway, which requires the above setup) would alleviate many of these issues. But that would not have happened back in the day.
User avatar
tokumaru
Posts: 12106
Joined: Sat Feb 12, 2005 9:43 pm
Location: Rio de Janeiro - Brazil

Re: The NES vs. its contemporary competition

Post by tokumaru »

Bregalad wrote:For example the player AI code is not dealing "directly" with hardware but you're still going to poll button presses from the NES controller, so if you port it to another system a full rewrite will still be needed.
What? Your AI should be making decisions based on a few bytes that describe the state of the controllers (currently held buttons and newly pressed buttons, usually), so all you have to change is how those bytes are formed, which is usually done in an isolated routine. Reading the controllers directly in the game logic is a bad practice I'd expect from those old GBAGuy tutorials (I remember an old tutorial that would strobe the controllers and read the status several times until the button of interest was reached, right in the middle of the game logic, once for each button... was that GBAGuy's?), so I really hope you're not doing that.
The same goes for any code that prints or displays anything on screen, etc. You won't be writing directly to $200x, but the way you prepare your buffers and handle screen alignment is still closely tied to the hardware.
I don't know about you, but my scrolling engine just spits arrays of tile indices and attributes. As long as the target machine is tile based, converting that data to another format would be fairly straightforward. But even if the hardware was very different, it must have a way of rendering rows and columns of blocks if you expect to port an NES scrolling game to it, so as long as you have isolated "DrawRow" and "DrawColumn" routines which are fed the coordinates of the row or column in the level map, you can modify only those routines to generate data in whatever format you need, even if you're not using name tables that are 32x30 tiles large.

The same goes for sprites... it shouldn't matter if sprites are not made of 8x8 tiles, as long as the same (or similar) visual effect can be achieved in another machine, it's just a matter of changing your "DrawSprite" routine to output data in the appropriate format. You might have to change the meta sprite data as well, depending on how different the new system is.

The point is that as long as the machines aren't severely restricted in resources (like the 2600 is), you can abstract most of the parts that deal directly with the hardware. The NES isn't exactly abundant with RAM and CPU time, but my game designs have always abstracted these details and that never caused a significant impact on performance.
tepples
Posts: 22345
Joined: Sun Sep 19, 2004 11:12 pm
Location: NE Indiana, USA (NTSC)
Contact:

Re: The NES vs. its contemporary competition

Post by tepples »

TailChao wrote:More detailed writeup available here courtesy of LX.NET.
So... MMC1 from heck, or maybe an SSD. An e-book reader would probably use something like CHM (Windows help file) compression instead of Huffword. CHM uses an LZ-family codec with a few pages in each independently compressed section, which needs several kilobytes of RAM but works fine with slow sequential ROM access. Huffword, on the other hand, needs little RAM but fast random access to a large static dictionary in ROM.
tokumaru wrote:my scrolling engine just spits arrays of tile indices and attributes. As long as the target machine is tile based, converting that data to another format would be fairly straightforward.
I think the point is that not a lot of other machines with the same instruction set were built around the paradigm of hardware sprites atop a scrollable grid of character cells with modifiable glyphs. The Neo Geo was based on vertical strips of 16x16-pixel tiles. And many contemporary western computers had dumb frame buffer displays, most of which couldn't even scroll the screen in hardware. For example, in HGR mode on the Apple II, scrolling the screen took four frames to copy about 8K of data, and horizontal scrolling in increments other than 14 pixels was even slower. Drawing sprites on one of those was a matter of software-compositing the sprites on top of the background tiles and then copying the whole thing to the screen. And the Apple II's attribute clash was almost as bad as the Spectrum's (7x1-pixel units instead of 8x8), causing most games to use orange/blue throughout, green/magenta throughout, or just black-and-white backgrounds. So you'd have to completely rethink a big scrolling game to make it fit such a platform.