Difference between 6502 and 2A03 CPU core
Although Quietust already made some explorations, I decided to do my own.
After a detailed study of the 2A03 circuit, the following results were obtained:
- No differences were found in the instruction decoder.
- The D flag works as expected: it can be set or cleared by the SED/CLD instructions, and it is handled in the normal way during interrupt processing (saved on the stack) and by the PHP/PLP and RTI instructions.
- The random logic responsible for generating the two control lines DAA (decimal addition adjust) and DSA (decimal subtraction adjust) works normally.
The difference is that the control lines DAA and DSA, which enable decimal correction, are disconnected from the circuit by cutting 5 pieces of polysilicon (see picture). Polysilicon is marked purple; the missing pieces are marked cyan.
As a result, the decimal carry circuit and the decimal-correction adders do not work.
Therefore, the embedded processor of the 2A03 always treats add/subtract operands as binary numbers, even if the D flag is set.
PSD source : http://breaknes.com/files/APU/core.zip [155 MB]
Podcast (Russian) : http://youtu.be/Gmi1DgysGR0
6502 schematics : http://breaknes.com/files/6502/6502.jpg
Re: Difference between 6502 and 2A03 CPU core
Are you saying that the circuit is there, it's just not connected? That's pretty interesting... They simply had to make the CPU different (for legal reasons?) so they just "broke" a certain feature, instead of not implementing it at all... so weird!
Re: Difference between 6502 and 2A03 CPU core
Yup, Nintendo "cracked" the 6502 to avoid patent payments.
Here is the patent: http://www.google.com/patents/US3991307
"Integrated circuit microprocessor with parallel binary adder having on-the-fly correction to provide decimal results"
So they only needed to cut the decimal correction.
Re: Difference between 6502 and 2A03 CPU core
And remember that at the time, only patents covered microprocessors. Copyright-like exclusive rights in integrated circuit topographies don't apply to ICs first sold before about 1990. Perhaps this is why the Super NES's CPU includes an authentic second-source 65816 core.
Re: Difference between 6502 and 2A03 CPU core
This actually makes full sense. Placing transistors on a die is a difficult, complex, and painful job. Today this can be automated for digital circuits, but back in the '80s I'm not sure it could be. It makes sense that they would use a working die and just remove a few connections instead of having to re-do a 6502 without decimal mode.
Re: Difference between 6502 and 2A03 CPU core
Sneaky. So Commodore's engineers were correct:
Quoted from Bagnall's book On the Edge:
[Commodore 64 programmer] Robert Russell investigated the NES, along with one of the original 6502 engineers, Will Mathis. “I remember we had the chip designer of the 6502,” recalls Russell. “He scraped the [NES] chip down to the die and took pictures.”
The excavation amazed Russell. “The Nintendo core processor was a 6502 designed with the patented technology scraped off,” says Russell. “We actually skimmed off the top of the chip inside of it to see what it was, and it was exactly a 6502. We looked at where we had the patents and they had gone in and deleted the circuitry where our patents were.”
Re: Difference between 6502 and 2A03 CPU core
So how useful would decimal mode have really been?
Re: Difference between 6502 and 2A03 CPU core
Dwedit wrote: So how useful would decimal mode have really been?
It could have made scores and other stats easier to manage... Can't think of anything else.
Since I learned assembly with the 2A03, I don't really miss decimal mode. In games you might need an occasional BIN to DEC conversion, or addition and subtraction of decimal numbers, but those are things you can code routines for just once (or even use someone else's routines) and never think about again.
Re: Difference between 6502 and 2A03 CPU core
Bregalad wrote: This actually makes full sense. Placing transistors on a die is actually a difficult and complex and painful job. Today for digital circuits this can be automated, but back in the '80s I'm not sure it could. It makes sense they would use a working die and just remove a few connections instead of having to actually re-do a 6502 without the decimal mode.
These days a lot is automated, but the result is far from optimal and still needs human intervention to fix the worst offenders; it just avoids most of the work. It still takes a lot of effort to get done, especially with the complexity of current chips.
Re: Difference between 6502 and 2A03 CPU core
tokumaru wrote: In games you might need an occasional BIN to DEC conversion, or addition and subtraction of decimal numbers, but those are things you can code routines for just once (or even use someone else's routines) and never think about this again.
But they still have to be fast enough. ARMv4 (e.g. ARM7TDMI) has no decimal mode or hardware divide. Someone on the gbadev board used to complain that the sprintf() call converting binary numbers to decimal to draw the status bar every frame ate up a substantial portion of the available CPU time. And if you're storing both the binary version for calculation and the decimal version for display, why not just operate on the decimal version? That's what a lot of Atari 2600 game programmers tended to do, I'm told.
Re: Difference between 6502 and 2A03 CPU core
IMO, that's just bad programming then. Keep an x-digit piece of RAM, null-terminated, and store all points in an array where every digit is a byte. It's not that hard to do, even in C.
Re: Difference between 6502 and 2A03 CPU core
tepples wrote: if you're storing both the binary version for calculation and the decimal version for display, why not just operate on the decimal version?
3gengames wrote: Keep a x-digit RAM piece, null terminated, and have all points stored in a array where every digit is a byte.
Fans of decimal mode might have called that a waste of memory.
Re: Difference between 6502 and 2A03 CPU core
I imagine the most effective use case for this is an accounting program where you are keeping track of a lot of numbers onscreen, and you want to keep the UI responsive.
Of course, there's the additional overhead when multiplying BCD, which might throw a wrench into that goal...
Anyhow, it's convenient to have. It's better than having to write extra software routines to do the same thing, but as has been pointed out, those aren't that hard to drop into your program anyway, so the benefit is pretty minimal. If the NES had it, it would have been used.
Re: Difference between 6502 and 2A03 CPU core
org wrote: Yup, Nintendo "cracked" the 6502 to avoid patent payments.
Here is the patent: http://www.google.com/patents/US3991307
"Integrated circuit microprocessor with parallel binary adder having on-the-fly correction to provide decimal results"
So they only needed to cut the decimal correction.
Even after excising the decimal mode circuitry, what about the rest of it? Why didn't they have to pay royalties for using the "integrated circuit microprocessor" modules?
Re: Difference between 6502 and 2A03 CPU core
In 1983 there was no mask-work copyright.
The Famicom was made in the early 1980s, when copyright-like exclusive rights in mask works didn't exist yet. Until then, integrated circuit layouts were seen as too "utilitarian" to qualify for ordinary copyright. But by the release of the Super Famicom, the Treaty on Intellectual Property in Respect of Integrated Circuits (IPIC) of 1989 had been signed. So Nintendo licensed the 65816 from WDC.