Gigatron ASIC and the Gametron Handheld


Post by bmwtcu »

UPDATE:-
I mentioned in this thread that my long-term goal was to build a Gigatron handheld. I wasn't actually planning on doing it so soon, but when at67 revealed the SDCard Browser ROM on 1/22/2021, I realized that I had all the pieces to make it a reality, and I've been working the last few weekends to slap a proof of concept together.

This project is dedicated in memory of Marcel. None of this would have been possible without his sharing the Gigatron with the world. When he put out ROMv5a, Marcel said something that stuck in my head:-
"The Gigatron should be primarily about games and much less about becoming some kind of weird PC or an RPi/Arduino." - To that end, I present to the world the Gametron - the handheld Gigatron - AFAIK the world's only handheld Gigatron as of this date 2/15/2021.


THE PLAN/OBJECTIVES
[Attachment: POCplan.JPG]
Slap a bunch of dev boards together to see if this was even possible. I wanted to leverage the existing code base so that the handheld could benefit from anything new on the software front, whether a new ROM or further work on the Pluggy Reloaded - the design had to be 100% code-compatible with the original Gigatron TTL design.
I needed to find out if interfacing directly to an LCD display was possible - It is!!!
I also needed to determine how much power the whole setup would consume and whether it was practical as a handheld - The proof of concept as currently set up consumes ~900mA@5V, which in my opinion is a little too much to be practical as a handheld. Better component selection should help bring that number down. I ran tests with the backlight off and determined that the backlight alone accounts for ~650mA@5V, so clearly there is a lot of room for improvement just from changing the display.

NEXT STEPS
* Learn how to strip down a production ROM so I don't have to keep bugging at67 to get a bespoke ROM every time he puts out a new release - instructions are found later in this thread.
* Find a smaller display. From both a physical proportion and power consumption standpoint, a production handheld would benefit from a smaller native 640x480 display. Smaller displays exist, but the trade-off is at best higher cost and at worst added complexity, since deviating from native 640x480 would mean changing the software VGA timing.
* Migrate the FPGA platform to a more modern, less power-hungry and more affordable one. The Papilio Pro costs ~US$75 and has more functionality than is needed in a handheld Gigatron. I can't build the Class D amp/Arduino/SD card for what the dev boards cost, but if I can integrate some of that functionality in the FPGA, it would help from a cost/real-estate/board-complexity/power-consumption standpoint. The Spartan 6 FPGA on the Papilio Pro is over a decade old and in my opinion obsolete. I am in the process of evaluating more modern options and have designed a breakout board with which to test a Gowin GW1N-9.
[Attachment: gowinbreakout.png]
[Attachment: POCphase2.JPG]
[Attachment: POCphase3.JPG]
ACKNOWLEDGEMENTS
I'd like to thank:-
* Marcel and Walter for giving me something to obsess over as a pandemic project
* menloparkinnovation for releasing his SystemVerilog Gigatron implementation that I adapted to the Papilio Pro FPGA development board
* the Gigatron community for their contributions of applications that run on the platform. I'm especially appreciative of the brainstorming sessions with at67 and norgate as I put this proof of concept together, since I relied heavily on the SDCard Browser and the Pluggy Reloaded

WHAT THIS MEANS FOR THE GIGATRON ASIC
Well, I'm certainly going to be spending more time on the handheld front, so it's unlikely that I will be able to make the schedules stipulated by Google, but free ASIC fabrication runs don't grow on trees, so I might still work on it a little bit. Sometimes I wish I didn't have a day job and mortgage to worry about...:P

ORIGINAL POST:-

https://www.youtube.com/watch?v=EczW2IWdnOM
https://docs.google.com/presentation/d/ ... EpMLwK/pub
https://groups.google.com/g/skywater-pd ... zBSayPQy4I
https://fossi-foundation.org/2020/06/17/fossi-dial-up
https://fossi-foundation.org/2020/06/30/skywater-pdk

Anyone interested in turning the Gigatron into an ASIC? We already have a DE10-Nano FPGA implementation from menloparkinnovation (https://forum.gigatron.io/viewtopic.php?f=4&t=55) and I've been able to strip it down to a synthesizable Gigatron core that should work with any FPGA architecture in conjunction with R2R DACs. What I lack is the know-how and patience to generate test vectors to prove it out fully for the Skywater PDK (https://github.com/google/skywater-pdk).

I would suggest:-
1) on-die 64Kx8 SRAM
2) clock divider circuitry that supports 25MHz %2 (12.5MHz) and %4 (6.25MHz) options - see the sketch after the I/O list below
3) on-die 64Kx16 FRAM/flash/EEPROM loadable via JTAG???/RISC-V supervisor

This leaves 17 I/O (5V I/O is supported) for:-
- Video output (8)
- Audio output (4)
- LED output (4)
- Serial input (1)
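
For item 2, the divider/select is only a couple of flip-flops and a mux in front of the core clock. A rough sketch of the intent (module and signal names are mine, not taken from the existing FPGA code, and a real chip would want a glitch-free clock mux and proper clock-tree cells):

```systemverilog
// Sketch only: 25MHz in, jumper-selectable 25/12.5/6.25MHz out to the core.
module clk_sel (
    input  logic       clk25,    // 25MHz oscillator
    input  logic       rst_n,    // active-low reset
    input  logic [1:0] sel,      // jumpers: 00 = 6.25MHz (default), 01 = 12.5MHz, 1x = 25MHz
    output logic       clk_core  // clock to the Gigatron core
);
    logic div2, div4;

    always_ff @(posedge clk25 or negedge rst_n)
        if (!rst_n) div2 <= 1'b0;
        else        div2 <= ~div2;              // 12.5MHz

    always_ff @(posedge div2 or negedge rst_n)
        if (!rst_n) div4 <= 1'b0;
        else        div4 <= ~div4;              // 6.25MHz

    // Plain mux for illustration only; a real design needs a glitch-free clock mux.
    always_comb
        case (sel)
            2'b00:   clk_core = div4;           // stock Gigatron speed
            2'b01:   clk_core = div2;
            default: clk_core = clk25;
        endcase
endmodule
```

Two jumpers (or bond-option pads) pulling sel high/low would pick the speed.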

[Copied from BigEd on AnyCPU forum - https://anycpu.org/forum/viewtopic.php?f=3&t=756]
Google is offering free chip fabrication runs
- up to 40 designs will be fabricated for free
- if they get more than 40 submissions they will choose which to make
- successful projects will get of the order of a hundred assembled (and tested?) parts
- the chips are 16 mm^2, with 10 mm^2 free for use and the rest a RISC-V supervisor
- there are 40 I/Os, it's a 130nm process, and it might even be 5V tolerant
- the project must be open source and on github
- the aim is to pipeclean an open source design flow
- the design flow isn't yet quite final
- the first chip run is in November 2020

I can post what I have on github if there is any interest. Thoughts?

PS - If there's enough room for two of the cores, we might even be able to implement the 2nd core with the 244 instead of the 595 as proposed by Marcel.

Re: Gigatron ASIC?!?

Post by Serion »

I saw that Google was doing this and thought it was totally awesome! Unfortunately, I can do little more than say this is an awesome idea, since I have ZERO knowledge of programming an FPGA. (Although I do have a DE10-Nano.)

Re: Gigatron ASIC?!?

Post by at67 »

This is a fantastic idea. I personally have no ASIC experience and my FPGA experience is starting to get rusty, but I'm available to help with whatever I can to try and make this a reality.

All the extra features you mentioned sound great to me, the only thing I would add, (seeing as this is effectively an upgraded Gigatron), is an inbuilt USB interface that conforms to the current "Loader" mechanism that loads .gt1 files over the controller port. I have no idea of the best way of approaching this, but if you think it's a good idea I can try and come up with something.

An ASIC with an external Arduino hooked up to it just doesn't feel right, but a completely self-contained ASIC that anyone can plug any (correctly configured) USB device into and upload .gt1 files to sounds magnificent.

Re: Gigatron ASIC?!?

Post by bmwtcu »

So each of the chips comes with a RISC-V on board which is part of the standard harness that efabless would design, so I'm more confident in your ability to port Babelfish over to the RISC-V and do away with the Arduino altogether than I am in my ability to slap on an AVR core from OpenCores and have it work first time. Given the timeline for project submission by Nov (TBD), in order to have any chance of success we should follow Marcel's philosophy of keeping the hardware simple and doing whatever is possible in software. I'm taking a stab at the work breakdown below, which I would have to run by Tim to verify my understanding. I'm a board-level digital designer and have never done most of this before, so feel free to correct my understanding of what needs to happen. I welcome any feedback and volunteers. These are lofty goals even without a timeline and I have no doubt this can't be done alone. I figure if Google is paying for it, I've little to lose in learning something new.

BEFORE NOV 2020
Publish applicable subset of existing FPGA design to Github as baseline
Modify RTL to add clock divider/MUX for 25MHz %2 and %4
Tie RISC-V GPIO to core and MUX with serial data/HSYNC/VSYNC
Formal verification/Design review
Learn PDK tools
Generate OpenRAM 64kx8 SRAM macro - what area of die is this at the 130nm node?
Generate ?OpenFlash? 64kx16 flash macro - same die-size question, and does the RISC-V already have access to the full flash to program it?
Place/Route/Generate GDSII using OpenLANE tools
DRC with MAGIC
Close timing @12.5MHz
Publish pinout

AFTER NOV 2020
Design WLCSP footprint library in KiCad 5
Design PCB in KiCad 5
Design review
Order ?4-layer? board and stencils from JLCPCB - any existing breakout study for chosen package?
Build proto
RISC-V Babelfish port (need details on peripherals available, but assuming USB 2.0 Host/Hub to read flash drive and eventually USB controller?)
Test/debug

Re: Gigatron ASIC?!?

Post by at67 »

bmwtcu wrote: 19 Jul 2020, 13:36 So each of the chips comes with a RISC-V on board which is part of the standard harness that efabless would design, so I'm more confident in your ability to port Babelfish over to the RISC-V and do away with the Arduino all together than I am in my ability to slap on an AVR core from Opencores and have it work first time.
That sounds like the most sensible option; as long as the RISC-V has access to all the pertinent signals, we can add whatever comms functionality we like in software.

Have you thought about SRAM and FLASH arbitration? i.e. How do the RISC-V and Gigatron arbitrate over the memory resources?
- Dual ported?
- Intermediate FIFO access?
- Alternate clock access?
- Master access through the RISC-V itself?

I know that potentially this opens up a can of worms in terms of design complexity and I completely agree with your quote of Marcel's design philosophy, but multi-device access to memory could allow unlimited future updates and upgrades, not just for the firmware itself, but also for external entities that want to access the Gigatron's RAM/ROM without going through Loader/BabelFish. e.g.

- Debugging of both vCPU and Native becomes trivial if the RISC-V has access to the Gigatron's ROM and RAM whilst it is running, (in the same way my emulator controls the virtual Gigatron through its RAM and ROM).
- External devices could blast data/code into the Gigatron's RAM at USB speeds whilst it is running, opening up a plethora of options like: chained applications, streaming resources, (video/audio, game data), etc.
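
Just to show how little logic the first option, (dual-ported), needs: a true dual-port 64Kx8 RAM with one port owned by the Gigatron and one by the RISC-V would let the supervisor peek and poke memory while the core runs. A minimal inferred sketch, (names are mine, and on the ASIC you would instantiate an OpenRAM macro rather than infer it):

```systemverilog
// Sketch of the "dual ported" option: independent ports for the Gigatron
// core (port A) and the RISC-V supervisor (port B), each in its own clock domain.
module ram_dp_64kx8 (
    input  logic        clk_a, clk_b,
    input  logic        we_a,  we_b,
    input  logic [15:0] addr_a, addr_b,
    input  logic [7:0]  din_a,  din_b,
    output logic [7:0]  dout_a, dout_b
);
    logic [7:0] mem [0:65535];

    always_ff @(posedge clk_a) begin             // Gigatron side
        if (we_a) mem[addr_a] <= din_a;
        dout_a <= mem[addr_a];
    end

    always_ff @(posedge clk_b) begin             // RISC-V side
        if (we_b) mem[addr_b] <= din_b;
        dout_b <= mem[addr_b];
    end
endmodule
```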
bmwtcu wrote: 19 Jul 2020, 13:36 BEFORE NOV 2020
Publish applicable subset of existing FPGA design to Github as baseline
Modify RTL to add clock divider/MUX for 25MHz %2 and %4
Tie RISC-V GPIO to core and MUX with serial data
Formal verification/Design review
Learn PDK tools
Generate OpenRAM 64kx8 SRAM macro - what area of die is this at 130nm node?
Generate ?OpenFlash? 64kx16 flash macro - does RISC-V already have access to full flash to program it?
Place/Route/Generate GDSII using PDK tools
DRC with MAGIC
Close timing @12.5MHz
Publish pinout

AFTER NOV 2020
Design WLCSP library shape in Kicad 5
Design PCB in Kicad 5
Design review
Order ?4-layer? board and stencils from JLCPCB - any existing breakout study for chosen package?
Build proto
RISC-V Babelfish port (need details on peripherals available, but assuming USB 2.0 Host/Hub to read flash drive and eventually USB controller?)
Test/debug
That is a lot to digest :) So I'll go through each milestone and figure out what I can contribute. I also think you'll need to come up with a concrete functional spec of exactly what you want this Gigatron-on-steroids-on-a-chip to do, (i.e. its exact capabilities), quickly, and set it in stone. I can throw ideas at you, (and hopefully others can as well), but very quickly, (within the next week or two I would guess), it would need to be ratified and signed off.

Very exciting...

Re: Gigatron ASIC?!?

Post by bmwtcu »

at67 wrote: Have you thought about SRAM and FLASH arbitration? i.e. How do the RISC-V and Gigatron arbitrate over the memory resources?
Good question... It's not entirely clear to me if the RISC-V has on-die memory as well or will require external NVM/RAM, so I have some reading to do. Regardless, I was assuming that the Gigatron would have separate SRAM and NVM from the RISC-V, but that the RISC-V would have read access to the Gigatron memory map during JTAG debug on the RISC-V. The YouTube video made vague references to being able to use the RISC-V to monitor the design, but it's not clear what is actually possible. The Gigatron NVM should be accessible via the RISC-V for purposes of updating it with a new ROM version, but otherwise the RISC-V really only needs to read from it during debug. I don't think they have spelt out how the NVM will be supported, but I'm actually hoping for something like an FRAM, so it's really no different from writing to/reading from a DPSRAM. Similarly, I was assuming that there would be RISC-V JTAG functionality that allows you to access both the RISC-V and Gigatron SRAM, but not simultaneously. Will that be sufficient? You can tell I've no experience using a debugger.
at67 wrote: concrete functional spec of exactly what it is you want this Gigatron on steroids on a chip to do, (i.e. it's exact capabilities),
I don't consider this to be a Gigatron on steroids, actually. I think on Day 0 we should be able to use JTAG or write code on the RISC-V to load the Gigatron NVM with ROM v5a, and be able to use a direct connection to the serial data input to support a Famicom controller. It will NOT ever be able to support the SPI expander, since that requires exposing the SRAM interface (the expander sits on the RAM address/data bus), and we don't have enough pins for that.
What's new is:-
1) use of a 25MHz oscillator. Truth be told I had difficulty finding a 6.25MHz crystal last time I looked, so this is mostly about price and availability, given that every Ethernet design out there already uses 25MHz. It's trivial to add 2 FFs to implement the %2 and %4 required to support 6.25MHz on Day 0 and offer an easy path to 12.5MHz. As seen before, that results in a large gain in processing capability that opens up more possibilities in future. I'd implement this with 2 MUXes that bypass the FFs, so you could set 2 jumpers to get 25MHz/12.5MHz/6.25MHz (default with no jumpers), along the lines of the divider sketch earlier in the thread.
2) a default 64Kx8 SRAM would open up the software possibilities, as evidenced by your own development efforts.
3) I feel that the real value-add is the RISC-V, which (assuming it has USB) could handle a USB flash file system to load a .gt1 file via Babelfish from a USB MSD and support HID peripherals like a USB keyboard/controller. That negates the need for the SPI expander if you don't mind the speed, but I'm undecided if it's going to be too drastic a departure from the existing Gigatron. Regardless, on Day 0 the ASIC should work with a Pluggy Reloaded or Arduino just like a regular Gigatron. A MUX will default to the serial data input pin, and a jumper would configure the MUX to switch to the RISC-V - see the sketch below.
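
The mux in 3) is a one-liner; something like this, (signal names are hypothetical):

```systemverilog
// Sketch: serial data into the Gigatron input register comes from the external
// controller/Pluggy pin by default, or from a RISC-V GPIO when the jumper is set.
module ser_in_sel (
    input  logic use_riscv,   // jumper: 0 = external pin (default), 1 = RISC-V GPIO
    input  logic ser_pad,     // external serial data pin (Famicom pad / Pluggy Reloaded)
    input  logic riscv_gpio,  // Loader data bit-banged by the supervisor
    output logic ser_to_core  // serial data into the Gigatron's input shift register
);
    assign ser_to_core = use_riscv ? riscv_gpio : ser_pad;
endmodule
```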
at67 wrote: Very exciting...
Don't get too excited... It's all talk until people sign up to do actual work, and at this point the only thing I've signed up for is publishing Verilog code to GitHub, at least until I figure out if the tools assume any familiarity that I lack :lol: While I'm game to learn, it's not clear to me if the open source tools really offer up any shortcuts to the ASIC development process, or if I'm biting off way more than I can chew. That said, I feel that the Gigatron is simple enough that there is some chance of success.

I suspect this endeavor is only worthwhile from a "learning PDK/EDA tools" standpoint. The folks who enjoy the soldering aspect of the Gigatron aren't going to get much out of this, and really an FPGA implementation opens up many more opportunities for experimentation than an ASIC. I don't think we've seen the last of the Gigatron hardware improvements, so really the benefit in the short term will be having more than 1000 platforms for the software folks to showcase how they push the limits of the Gigatron design, at least until we obsolete the ASIC with further hardware improvements.

Re: Gigatron ASIC?!?

Post by bmwtcu »

Hi,
Is there documentation on what end-user capability is available on the RISC-V supervisor?

Firstly, I'm interested in whether there is:-
1) JTAG debug interface that can map/modify end-user 64kx16 NVM.

The supervisor will load its program from an external SPI flash. There will be some default programs and you'll also be able to provide your own.

2) USB Host capability
3) USB Hub on-chip

There is currently no USB support. I am interested in seeing people create working USB IP here which will lead to future harnesses having a USB interface.

4) need for external RAM/ROM for the RISC-V, and whether this chews up the ~40 I/O available.

The current plan is to have ~40 I/O pins for the user (with about 48 in total). The RISC-V harness will have internal RAM and load it via SPI, meaning you only lose 4 pins; then you need power and GND pins.

You also have plenty of space to put your own SRAM in your area using OpenRAM.

5) RISC-V GPIO for bit-banging serial protocol to user design

The harness should have plenty of "internal" GPIO pins which can be connected to a user's design. Guidance around this will be released in the near future.

Secondly, is dual-port FRAM an option supported by the tools/process? It's not clear from the presentation what the NVM options definitively are. I'd like to figure out die area based on needing 64kx16 bits and am not sure where to start.

There will be an NVM build space released in the future, but someone will need to develop an NVM-compatible memory compiler which is able to turn the individual cells into a full-blown memory block.

Hope that helps,

Tim 'mithro' Ansell
If this is truly going to be an iterative process with multiple opportunities to add USB at a later date, we could do without USB the first time through, but no NVM seems like a showstopper. Maybe just have it be a 64kx16 SRAM that loads via SPI at each power-up?

Re: Gigatron ASIC?!?

Post by at67 »

bmwtcu wrote: 19 Jul 2020, 20:13 If this is truly going to be an iterative process with multiple opportunities to add USB at a later date, we could do w/o USB first time through, but no NVM seems like a show stopper. Maybe just have it be a 64kx16 SRAM that loads via SPI each power-up?
Coming up with an NVM compiler for this ASIC's development environment in the time allotted with everything else that needs to be accomplished sounds like a recipe for failure.

Using SRAM for the ROM, and worrying at a late stage about how we fill it at power-up, sounds like the perfect compromise.
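
Roughly all that takes is a small state machine that clocks a plain 0x03 READ out of the SPI flash at power-up and streams 128KiB into the ROM-SRAM. A sketch, (pin names, the flash command set and the byte order within each 16-bit word are assumptions that would need checking against the real flash part and ROM image):

```systemverilog
// Sketch: fill a 64Kx16 "ROM" SRAM from SPI flash at power-up (SPI mode 0).
module spi_rom_loader (
    input  logic        clk,        // system clock, e.g. 25MHz
    input  logic        rst_n,
    // SPI flash pins
    output logic        spi_cs_n,
    output logic        spi_sck,
    output logic        spi_mosi,
    input  logic        spi_miso,
    // write port into the 64Kx16 ROM-SRAM
    output logic        rom_we,
    output logic [15:0] rom_addr,
    output logic [15:0] rom_data,
    output logic        done        // high once all 65536 words are loaded
);
    localparam logic [31:0] CMD_READ0 = 32'h03_00_00_00;  // READ, 24-bit address 0

    typedef enum logic [1:0] {S_CMD, S_DATA, S_DONE} state_t;
    state_t      state;
    logic [31:0] shift_out;
    logic [15:0] shift_in;
    logic [5:0]  bit_cnt;
    logic [15:0] word_cnt;

    assign spi_mosi = shift_out[31];

    always_ff @(posedge clk or negedge rst_n) begin
        if (!rst_n) begin
            state     <= S_CMD;   spi_cs_n <= 1'b0;  spi_sck  <= 1'b0;
            shift_out <= CMD_READ0;  shift_in <= '0;
            bit_cnt   <= 6'd32;   word_cnt <= '0;
            rom_we    <= 1'b0;    rom_addr <= '0;    rom_data <= '0;  done <= 1'b0;
        end else begin
            rom_we <= 1'b0;                       // single-cycle write strobes
            if (state == S_DONE) begin
                spi_cs_n <= 1'b1;  spi_sck <= 1'b0;  done <= 1'b1;
            end else begin
                spi_sck <= ~spi_sck;
                if (!spi_sck) begin               // rising edge of SCK: sample MISO
                    shift_in <= {shift_in[14:0], spi_miso};
                    bit_cnt  <= bit_cnt - 6'd1;
                    if (bit_cnt == 6'd1) begin
                        if (state == S_CMD) begin
                            state   <= S_DATA;    // command+address sent, data follows
                            bit_cnt <= 6'd16;
                        end else begin
                            rom_we   <= 1'b1;     // commit one 16-bit word
                            rom_addr <= word_cnt;
                            rom_data <= {shift_in[14:0], spi_miso};
                            word_cnt <= word_cnt + 16'd1;
                            bit_cnt  <= 6'd16;
                            if (word_cnt == 16'hFFFF) state <= S_DONE;
                        end
                    end
                end else begin                    // falling edge of SCK: next MOSI bit
                    shift_out <= {shift_out[30:0], 1'b0};
                end
            end
        end
    end
endmodule
```

rom_we/rom_addr/rom_data would drive a second (or muxed) write port into the same SRAM the Gigatron fetches from once done goes high.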

Something else to consider is the non-standard effective pixel clocks that will be generated by a Gigatron running at any frequency other than 6.25MHz, i.e. even with Marcel's modified ROM3y.ROM the image generated will always have some kind of nasty clock-dependent aspect ratio. A solution to the variable clock problem, and potentially even to high-resolution and high-colour-depth video modes, is to decouple the video output from the master clock.

Doing this in the firmware is basically impossible, (unless it was running so fast that we could run tens of native code instructions per effective pixel output; currently it runs 1 instruction per pixel output). So my proposal is to decouple the video from the clock in the FPGA/ASIC with a FIFO, a couple of counters and some basic logic.

A 160x8 FIFO intercepts the native OUT instructions, (VSync and HSync can be used to decode this instruction), one scanline at a time, at whatever speed the Gigatron is able to generate them. Then simple counter logic generates new VSync and HSync outputs at the normal effective 6.25MHz pixel clock. The FIFO would need to be dual-ported or use internal double buffering, (half-full/half-empty flags etc), to allow dual access from both the Gigatron, (write only), and the video logic, (read only).

The beauty of a system like this is that not only does it solve the coupling between the variable main frequency and the pixel clock, it also allows the various scanline modes that are currently implemented in software to be implemented in hardware. e.g. the Gigatron writes only one of the four repeated scanlines to the FIFO and then the FPGA/ASIC logic repeats the scanline by just re-reading and outputting the FIFO 0 to 3 times, all without burdening the Gigatron whatsoever. In effect the Gigatron is permanently in video mode 3, providing the most efficient vCPU performance but also providing the various scanline modes without the associated vCPU performance hits.

If this new video logic was made accessible on the ASIC's external pins then it could be expanded trivially with external circuitry to add high-resolution or high-colour-depth, (paletted), modes as well. You could even add a basic palette into the FIFO/counter logic, say 6 bits out of 12 bits, providing 64 out of 4096 colours; how you map the palette memory into the Gigatron's address space could be trivial and use any of a number of methods, e.g. Blinken Lights.
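
The heart of it is nothing more than a line buffer written at the Gigatron's clock and replayed a programmable number of times at a fixed 6.25MHz pixel clock. A bare-bones sketch, (the fixed-timing sync generator, the OUT-instruction decode and the half-full/half-empty handshaking between the two sides are deliberately left out, and names/widths are my assumptions):

```systemverilog
// Sketch: one 160-pixel scanline is written at whatever speed the Gigatron
// runs at, then replayed REPEATS times at the fixed 6.25MHz pixel clock.
module line_replay #(
    parameter int REPEATS = 4             // scanline repetition factor
) (
    // write side: Gigatron core clock domain
    input  logic       wclk,
    input  logic       wr_en,             // decoded "OUT during active video"
    input  logic       line_wr_start,     // start of a new source scanline
    input  logic [7:0] pix_in,            // the OUT byte (low 6 bits = RRGGBB)
    // read side: fixed 6.25MHz pixel clock domain
    input  logic       rclk,
    input  logic       line_rd_start,     // from the fixed-timing sync generator
    output logic [7:0] pix_out
);
    logic [7:0] buffer [0:159];
    logic [7:0] wptr = 8'd0, rptr = 8'd0;
    logic [1:0] rep  = 2'd0;               // which repeat of the line we are on

    // write one pixel per qualifying OUT
    always_ff @(posedge wclk) begin
        if (line_wr_start) wptr <= 8'd0;
        else if (wr_en) begin
            buffer[wptr] <= pix_in;
            wptr <= wptr + 8'd1;
        end
    end

    // replay the same 160 pixels; 'rep' is where the handshake back to the
    // write side ("only accept a new line after the last repeat") would hook in
    always_ff @(posedge rclk) begin
        if (line_rd_start) begin
            rptr <= 8'd0;
            rep  <= (rep == REPEATS - 1) ? 2'd0 : rep + 2'd1;
        end else if (rptr != 8'd160) begin
            pix_out <= buffer[rptr];
            rptr    <= rptr + 8'd1;
        end
    end
endmodule
```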

P.S. Before I mainly specialised in software I used to be a video hardware engineer, so even though I am rusty I have a fair amount of expertise in this area and what I am describing, (the basic components), would be trivial in an FPGA/ASIC; when I was bright eyed, full of dreams and untainted hopes I wanted to be a GPU engineer, but in the end I had to settle for being a shader monkey and rendering engineer :/

Re: Gigatron ASIC?!?

Post by walter »

Interesting stuff.

I can imagine this is a project that will be a lot of fun to do, and it would be very rewarding to have Gigatron ASICs. Is the goal just to learn about these systems and have fun while doing so, or do you see the resulting Gigatron ASIC as something others would be interested in? What would be the advantages over the TTL version?

Re: Gigatron ASIC?!?

Post by bmwtcu »

Hi Walter, for me personally it's a learning exercise. I have never done this before, so don't hold your breath if you're waiting for an ASIC kit :lol: Given its relative simplicity, I feel the Gigatron design affords me a good chance of actually getting something done in the short timeframe, given I still have a day job. If I actually get to the point of being selected and get chips back to test, that would be a bonus, but the WLCSP package makes it a challenge. At least in the first iteration there would be no additional features by default over the TTL version; cost and size would be the only practical advantages. Short term, I'd try to see if it's feasible to get the Gigatron PCB down to 1 sq inch! Depending on core vs I/O voltage requirements, it might be doable. Longer term, I'd have to look into whether I can find a suitable LCD display to make this a handheld! I'm waiting for more information at this point before slapping a Visio diagram together of how this all works. I welcome all ideas and critique, but would probably stay true to the original Gigatron architecture on this first pass. Feature creep and short timelines don't mix well.

at67, I understand what you're saying, and it's probably something best prototyped on an FPGA, with a scheme to select where the syncs and video output come from, but in this first iteration I'm going to claim success if ROMv3y loads and runs at 12.5MHz. Hell, if I get Blinkenlights after loading a ROM image from SPI to the 64kx16 SRAM, I would consider that a success! :D