How does Loader expect GT1 data to be presented?

mcbrown
Posts: 6
Joined: 12 Nov 2018, 11:41

How does Loader expect GT1 data to be presented?

Post by mcbrown »

Hi Gigatron community,

I'm looking forward to building my own Gigatron during the Christmas holidays. Until then, I've started writing my own simulator/emulator. I do know that there's already a fully working JavaScript emulator by PhilThomas and another really complete one written in C++ using SDL by at67. Still, I'd like to understand the Gigatron better by writing yet another emulator myself. I've chosen to implement it in Objective-C for Mac OS X. So far, I've got the CPU, display output, sound, keyboard input and the Blinkenlights working in a more or less robust way. I'll commit my project to gigatron-rom/Contrib once I'm happy with the code quality.

I'm currently stuck at understanding how the loading of GT1 files in Loader works. I understand that the IN register is involved, but I can't figure out when to update the register with the next byte from a GT1 file.

Sorry for bothering you with this question, but even after looking at the JavaScript code and at the Arduino code in BabelFish, I couldn't get my head around the protocol used by the loader. Maybe you can summarize it in a couple of sentences? That would be great!

Thanks a lot and best regards,
Tobias
at67
Site Admin
Posts: 647
Joined: 14 May 2018, 08:29

Re: How does Loader expect GT1 data to be presented?

Post by at67 »

In an emulator the simple thing to do is just slam the code contained within the GT1 file into the emulator's RAM array and then have it executed by stuffing the execute address into 0x0016-0x0017 and 0x001a-0x001b.
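The "slam it into RAM" route can be sketched in C; this is a minimal sketch under assumptions (loadGt1 and ram are invented names, the file is assumed well-formed, and GT1's zero-page special case is not handled). It walks the segment list described in Docs/GT1-files.txt and stuffs the execute address into 0x0016-0x0017 and 0x001a-0x001b:

```c
#include <stddef.h>
#include <stdint.h>

/* Minimal GT1 loader sketch (loadGt1 is an invented name): copy each
 * segment's payload into the emulator's 64K RAM array and store the
 * execute address in vPC (0x0016/0x0017) and vLR (0x001a/0x001b).
 * Assumes a well-formed file; GT1's zero-page special case is not
 * handled. Returns 0 on success, -1 on a truncated file. */
int loadGt1(const uint8_t *gt1, size_t size, uint8_t *ram)
{
    size_t i = 0;
    while (i < size && gt1[i] != 0) {              /* hiAddress 0 ends the list */
        if (i + 3 > size) return -1;
        uint16_t address = (uint16_t)(gt1[i] << 8 | gt1[i + 1]);
        size_t length = gt1[i + 2] ? gt1[i + 2] : 256; /* 0 means 256 bytes */
        i += 3;
        if (i + length > size) return -1;
        for (size_t j = 0; j < length; j++)
            ram[(address + j) & 0xffff] = gt1[i + j];
        i += length;
    }
    if (i + 3 > size) return -1;                   /* terminator: 0 hiStart loStart */
    uint8_t hiStart = gt1[i + 1], loStart = gt1[i + 2];
    ram[0x0016] = (uint8_t)(loStart - 2);          /* vPC low, minus 2: see note */
    ram[0x0017] = hiStart;                         /* vPC high */
    ram[0x001a] = (uint8_t)(loStart - 2);          /* vLR low */
    ram[0x001b] = hiStart;                         /* vLR high */
    return 0;
}
```

The low byte of the start address is stored minus 2 because vCPU advances vPC by 2 before fetching each instruction; the Loader application itself does the same when executing a received program.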

If you actually want to emulate the real loading process that "Loader.ino" creates, then you need to present the data to the IN register during the rising HSync handler on the correct scan lines, (either 1 byte at a time, or 1 bit at a time depending on how accurate you want your emulation to be, see below).

Within Contrib/at67/Loader.cpp are 3 functions that show one way of achieving this.

If you look at the three functions sendByte(), sendFrame() and upload(), you can see that there are two simple state machines controlling the flow of data into the IN register. upload() is called within the rising HSync handler to establish the correct timing.

I toyed with the idea of using a thread to bash the IN register with the correct values at the correct times, but quickly realised it would have been extremely difficult to guarantee the correct timing, so I wrote the interleaved state machines instead and they work fine.

P.S. in your emulator there is absolutely no need to fully emulate Loader.ino to this extent, as nothing within the Gigatron ecosystem depends on this loading architecture; you can fully get away with just stuffing GT1 file bytes directly into your emulator's RAM. But if you are doing this as a learning/research exercise then you may want to emulate the full bit-bashing architecture of Loader.ino, which sendByte(), sendFrame() and upload() do not.
at67
Site Admin
Posts: 647
Joined: 14 May 2018, 08:29

Re: How does Loader expect GT1 data to be presented?

Post by at67 »

Here's an excerpt of a conversation I had about this very subject many months ago, hopefully this sheds some more light on the process.

If you check in "Loader.cpp" for Loader::sendFrame() and Loader::upload(), I bypassed the bit-banging part of the uploading stage, as that is really only required for hardware, (the only way to get data into hardware is through the input port, (IN), which requires a specific bit format presented on the rising edge of HSync every scan line). In an emulation all you need to do is satisfy the software requirements of the ROM rather than worry about the hardware: the ROM software expects an 8 bit value to have been shifted into the IN register every 8 scan lines, (except for one 6 bit case). So in emulation you can do the following:

- Watch vgaY, (which the bare-bones emulator generates), and wait for -36; this value is the start of VBlank.
- Count 8 lines, (just do a comparison on vgaY for -36+8), and you know the loader expects the first byte, (which is actually called FirstByte); put FirstByte into the IN register, (which is just a variable under software emulation, i.e. check out gtemu.c).

- Make sure the IN register has this valid data during the rising edge of HSync, (the rising edge of HSync is simple to determine in emulation, once again check out gtemu.c).

- Wait 6 scan lines for the next byte, (MsgLength), so -36 + 8 + 6, and then once again stuff MsgLength into the IN register making sure it is valid during the rising edge of HSync.

- Wait another 8 scan lines, so -36 + 8 + 6 + 8, for the LowAddress byte, and -36 + 8 + 6 + 8 + 8 for the HighAddress byte.

- All that is left is 60 bytes of your payload; the original upload has to be split up into 60 byte packets, one packet per vertical frame, with the remainder bytes, (upload size % 60), sent in the last frame.
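The scan line arithmetic in the steps above can be collected into one small helper; a sketch, with packetByteLine an invented name, returning the vgaY line on which the n-th packet byte must be sitting in the IN register:

```c
/* vgaY line at which the ROM expects byte n of a Loader packet,
 * following the walkthrough above: VBlank starts at vgaY == -36,
 * the first byte is due 8 lines in, the 6-bit length byte only
 * 6 lines after that, and every later byte takes 8 scan lines. */
int packetByteLine(int n)
{
    if (n == 0) return -36 + 8;           /* FirstByte          */
    if (n == 1) return -36 + 8 + 6;       /* MsgLength (6 bits) */
    return -36 + 8 + 6 + 8 * (n - 1);     /* addresses, payload */
}
```

So byte 2 (LowAddress) falls on line -14 and byte 3 (HighAddress) on line -6, matching the sums above.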

Now, how you interleave these horizontal scanline IN register bytes into the gtemu.c loop is up to you; a separate thread would be easiest in terms of implementation, (just watch vgaY and follow the above). I didn't get around to using threads for this as I figured there could be timing issues on a taxed or single-core system, but any modern multi-core high-frequency CPU should be able to cope with this timing 99.9% of the time.

I implemented two state machines, a high level message one controlling the low level byte banging one, (the two functions I named earlier). The code is more complicated than doing it in a thread, as the operation of the state machines is interleaved within the gtemu.c loop, but it guarantees I meet the timing specs for uploading no matter how slow the emulator is running or how taxed the emulation CPU is.
marcelk
Posts: 488
Joined: 13 May 2018, 08:26

Re: How does Loader expect GT1 data to be presented?

Post by marcelk »

mcbrown wrote: 21 Dec 2018, 20:07
Sorry for bothering you with this question, but even after looking at the JavaScript code and at the Arduino code in BabelFish, I couldn't get my head around the protocol used by the loader. Maybe you can summarize it in a couple of sentences? That would be great!
  1. GT1 files are made out of segments of 1 to 256 payload bytes. See Docs/GT1-files.txt
  2. Each GT1 file segment comes with an address where these bytes must go in RAM
  3. BabelFish breaks up these GT1 file segments so they fit into the Loader protocol's smaller packets
  4. These Loader packets synchronise with the video signal
  5. A packet is transmitted starting from a new vSync pulse
  6. One bit gets transferred per hSync pulse (on the positive edges)
  7. The first byte is a command byte, and it is always 'L' now (for 'Load')
  8. The next 6 bits are the payload length (1..60)
  9. The next 2 bytes are the low and high RAM address
  10. The next 60 bytes are the payload bytes, possibly padded
  11. The last byte is a checksum over the bytes in the packet
  12. Loader shows a red pixel for receiving an invalid packet, and a green one for each good packet (sum == 0)
  13. The checksum is reset to 'g' when receiving an invalid packet. Otherwise it keeps summing across packets
  14. While summing, the 6-bit "byte" is added as a full byte, with 2 bits taken from the previous byte
  15. In emulation, you may have to take into account that the 74HC595 shift register has a 1-bit latency
So the maximum net transfer speed is almost 4KB per second. Details for the protocol can be found in the receiver (Apps/Loader.gcl) and in the sender (Utils/BabelFish/BabelFish.ino).
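For a concrete picture of the packet layout, here is a hedged sender-side sketch (Packet and makePacket are invented names, and the 6-bit summing subtlety of point 14 is deliberately ignored for clarity). The final byte is chosen so that the receiver's running sum lands on zero:

```c
#include <stdint.h>
#include <string.h>

enum { PayloadMax = 60 };

/* One Loader packet, per the numbered list above: command byte 'L',
 * payload length (1..60, transmitted as 6 bits), low/high RAM
 * address, 60 payload bytes (padded with zeros), and a final
 * checksum byte that forces the receiver's running sum to 0.
 * The caller keeps *checksum between calls so it carries across
 * packets; point 14's 6-bit quirk is ignored in this sketch. */
typedef struct {
    uint8_t bytes[PayloadMax + 5];   /* 'L', len, lo, hi, payload, sum */
} Packet;

Packet makePacket(uint8_t len, uint16_t address,
                  const uint8_t *payload, uint8_t *checksum)
{
    Packet p;
    memset(p.bytes, 0, sizeof p.bytes);
    p.bytes[0] = 'L';                   /* command byte, always 'L' */
    p.bytes[1] = len;                   /* 1..60, sent as 6 bits    */
    p.bytes[2] = address & 0xff;        /* low address byte         */
    p.bytes[3] = address >> 8;          /* high address byte        */
    memcpy(&p.bytes[4], payload, len);
    for (int i = 0; i < PayloadMax + 4; i++)
        *checksum += p.bytes[i];        /* running sum across packets */
    uint8_t last = (uint8_t)(0 - *checksum);
    p.bytes[PayloadMax + 4] = last;     /* receiver's sum becomes 0 */
    *checksum += last;
    return p;
}
```

Seeding the running checksum with 'g' (its reset value from point 13) before the first packet is an assumption of this sketch.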

The time between start of the vertical pulse and the first bit is extremely tight and absolutely critical. This is the only reason why interrupts are disabled on the Arduino while doing this, and why we clock the ATtiny85 to 8 MHz.

Hope this helps.
mcbrown
Posts: 6
Joined: 12 Nov 2018, 11:41

Re: How does Loader expect GT1 data to be presented?

Post by mcbrown »

Thank you very much for your answers, at67 and marcelk! I'm certain they will help a lot!

For a start, I'll try putting the bytes directly into my emulator's RAM to find out if things work as expected. With the explanation of the GT1 format in this thread and in the documentation, it should be rather simple to parse the file and put its payload into the correct address(es).
mcbrown
Posts: 6
Joined: 12 Nov 2018, 11:41

Re: How does Loader expect GT1 data to be presented?

Post by mcbrown »

Hi at67,

I've now got the approach of simply stuffing bytes into RAM working to some degree, but I've hit a bug which I haven't been able to fix yet. Let me try to explain what is happening:

- I let the emulator initialize the Gigatron (booting up to the program selection, playing the startup sound, displaying the flashing arrow).
- After that, I pause the emulation (at any point; I don't care what the CPU is currently executing).
- Then I read the GT1 file into RAM, segment by segment, with each segment at the correct start address as indicated in the segment header.
- I put the starting address into vPC and vLR,
- and then let the emulator resume its work.

Upon resume, the first vCPU command of the loaded code is skipped. vPC is incremented by 2 in any case, even if it's a three-byte command. vAC is not changed at all. This leads to the following behaviour:

- If the program starts with a two-byte command, some programs keep working more or less, others crash immediately.
- If the program starts with a three-byte command, the virtual CPU is out of sync and executes garbage.
- I can get any program to run correctly by inserting a dummy command at address 0x200 (e.g. ldi 0x00) and setting start_address to 0x200, although the real code only starts at address 0x202.

By debugging the RISC code that is being executed, I think I understood that

Code:

ENTER:        02ff fc03  bra  .next2      ;vCPU interpreter
              0300 1517  ld   [$17],y
NEXT:         0301 8115  adda [$15]       ;Track elapsed ticks
              0302 e80b  blt  EXIT        ;Escape near time out
.next2:       0303 c215  st   [$15]
              0304 0116  ld   [$16]       ;Advance vPC
              0305 8002  adda $02
              0306 d216  st   [$16],x
              0307 0d00  ld   [y,x]       ;Fetch opcode
              0308 de00  st   [y,x++]
              0309 fe00  bra  ac          ;Dispatch
              030a 0d00  ld   [y,x]       ;Prefetch operand
will load the opcode of the next command after my modification of vPC and skip the current one. But I could be completely wrong; I still have to get used to this pipelining thing.
Or maybe I forgot something else that I should modify in my emulator besides vPC (0016-0017) and vLR (001a-001b) to start the loaded code properly? Do you happen to have any ideas or things I could try to narrow the issue down further?
Or should I rather stop investigating this approach and switch to bit-banging the GT1 file into the loader via the IN register? What do you think?

Best regards,
Tobias
marcelk
Posts: 488
Joined: 13 May 2018, 08:26

Re: How does Loader expect GT1 data to be presented?

Post by marcelk »

Or maybe I forgot something else that I should modify in my emulator besides vPC (0016-0017) and vLR (001a-001b) to start the loaded code properly?
Try to subtract 2 from the low byte you put in [$16] and [$1a]. That's what the Loader itself does as well:

Code:

              8515 0129  ld   [$29]       ;Execute
              8516 a002  suba $02
              8517 c216  st   [$16]
              8518 c21a  st   [$1a]
              8519 012a  ld   [$2a]
              851a c217  st   [$17]
              851b c21b  st   [$1b]
The reason is as you indicate: vCPU increments vPC by 2 before fetching the instruction. [ In fact, putting this adjusted value in vLR is legacy, and not very useful. But it's what the Loader application does... ]

One more thing: when you pause the emulation, make sure you do this when you're out of vCPU. Otherwise there can be corruption by changing its virtual registers while they might be used. If the PC (not vPC) is in page 1 or 2, you know you're safe because that's the video/sound loop.
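That page check is a one-liner in an emulator; a sketch, assuming pc holds the native (non-virtual) program counter:

```c
#include <stdbool.h>
#include <stdint.h>

/* Safe-to-pause check from the advice above: when the native PC is
 * executing in page 1 or 2, the Gigatron is inside its video/sound
 * loop, so vCPU's virtual registers can be modified safely. */
bool safeToPause(uint16_t pc)
{
    uint8_t page = (uint8_t)(pc >> 8);
    return page == 1 || page == 2;
}
```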
mcbrown
Posts: 6
Joined: 12 Nov 2018, 11:41

Re: How does Loader expect GT1 data to be presented?

Post by mcbrown »

Try to subtract 2 from the low byte you put in [$16] and [$1a].
That was it, hooray! The emulator is now working rather well. Thanks a lot for your help!

I'll add the emulator to the Contrib section once I'm happy with the codebase and the featureset.

Best regards and thank you again for your great support!