
Worklog Some PS2 Project

Joined
Apr 5, 2023
Messages
15
Likes
13
Update time!

Sorry, I completely forgot to post updates! (again…)

Ever since the last update I’ve been working on FPGA stuff to get the video output of my mainboard up and running and I guess I’m slowly getting there.

After the little experiments with my DPI LCD I know more or less what to expect when it comes to resolutions and video timings. The big blocker was the lack of memory to buffer at least one field of each frame. It made the bridge between the 2 clocking regions very tricky and at some point it stopped me from progressing with my design.

The next logical step was to look into the available RAM options. First, I wanted to try the DDR3 on my Artix 7 Development board, as it would provide plenty of memory at a sufficient speed.

Getting DDR3 working is not trivial, though. I figured that out the hard way after 2 evenings of fighting with vivado's memory interface generator IP. That “experience” forced me to take a step back and reconsider things.
At that point I already knew that I'd like to use a cheaper FPGA for the final board, so it made no sense to waste time on proprietary Xilinx IP. Instead I looked into the RAM options of my FPGA of choice, the Trion T20. It didn't take much reading in the datasheet to realize that the only package I can economically design a PCB for (0.8mm pitch) has no hard DDR3 controller, and since I'm nowhere near experienced enough to write a soft DDR3 controller, that option was out of the game.
Instead, I chose an SDR SDRAM for the framebuffer, the IS42S32400F-6BL to be precise. Compared to DDR3 it mostly has drawbacks: it's slower, it's more expensive, SDR RAM is kinda outdated and on top of that it requires more pins. BUT it's considerably easier to drive and it makes the PCB layout much simpler, with less strict timing & impedance requirements.

You can imagine the next step…. Custom development board time!

View attachment 31269 View attachment 31270
This one was like a 2-weekend project and it should include everything I need to continue with video processing. I tried to check all the boxes regarding compatibility with my existing PS2 mainboard while making it flexible enough to be reusable for future projects. The main features are:
  • Trion T20F256
  • IS42S32400F-6BL SDRAM + QSPI flash for the FPGA config
  • All power regulators included, only requires 5V to operate
  • Two 80-pin mezzanine connectors with as many IOs as I could fit -> great for custom shields
  • 1 bank is selectable between 1.8V and 3.3V (mainly for the GS video interface)
  • Status LEDs, 1 user button
  • Programmable via JTAG (QSPI flash too)
  • The BOM cost of the whole board (including the PCB) is less than my old Artix 7 A35 FPGA (just the chip!!)
SDRAM length matching is a bit sloppy; KiCad kinda sucks in that regard and I just could not be bothered to waste a lot of time there (especially with 56 signals…). So, all lengths are within ~7mm (ignoring the propagation speed of inner vs outer layers). Should be alright for the speeds I need, but for the next revision I should improve it.

Of course, I also designed a shield for interfacing with my PS2 and the LCD later & a little JTAG programming adapter PCB for the FTDI module:

View attachment 31271 View attachment 31272

Assembly went smoothly, though I had to solder three little bodges during testing. The main issue was the RST signal. Short version: Efinix recommends using a voltage supervisor with a manual reset input to drive the FPGA's reset pin at power-up. The RST signal is also required for JTAG, and the application note says to connect all reset inputs to the manual reset input of the voltage supervisor. Doing that delays all reset pulses from the debugger by ~200ms, which made JTAG programming impossible. Connecting the JTAG RST to the FPGA directly, with the voltage supervisor soldered in parallel to handle startup, fixed it.

With the hardware part done, I can focus on the video interface again. First step is to design a proper framebuffer as the heart of the scaler. That was done in the last 2 weeks over Christmas.

The Efinix SDRAM controller IP did not convince me, and making things from scratch is fun, so I decided to design one myself as a learning exercise and to get some experience. I would not call the Efinity IDE and documentation beginner friendly, so it was a good opportunity to experiment with the tools too. There aren't a lot of features compared to vivado: no wizards, no templates, no simulator, no schematic view, no block designs; BUT it's a million times faster, the entire flow from synthesis to bitstream takes about as long as synthesis alone in vivado… I'm still using vivado for functional simulation, though.

Back to the SDRAM: The state machine was straightforward, but it was my first time debugging timings. In the end I got it working at 150MHz, but at the maximum of 166MHz the controller still just reads garbage. 166MHz should be achievable according to the timing analysis, but maybe I just have to recalculate my timing constraints to find the culprit.

Anyway, 150MHz should be plenty already: in simulation the controller hits ~400MByte/s write and ~300MByte/s read max when there are no rows to open/close. The video stream should require about 160MByte/s in the worst case.

The controller is optimized for R/W speed over latency and I tried to make it as simple as possible without sacrificing too much of the bandwidth. There are no fancy functions like command scheduling (overlapping, canceling, ...) or delaying refresh cycles, only one command at a time. It does do 8-word read/write bursts, though, to maximize the bandwidth (that's 32 bytes per command). On top of that it avoids auto-precharge and instead keeps track of the open rows to minimize overhead. So far the controller is passing all my test cases in simulation (with a generic SDRAM memory model).
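To make the open-row bookkeeping a bit more concrete, here is a minimal, hypothetical sketch of the idea (entity and signal names are made up; my actual controller folds this into its command state machine): remember which row is open in each of the 4 banks and only schedule a PRECHARGE/ACTIVATE when a request targets a different row.

```vhdl
library ieee;
use ieee.std_logic_1164.all;

entity row_tracker is
  port (
    clk            : in  std_logic;
    rst            : in  std_logic;
    req_valid      : in  std_logic;                      -- a read/write burst is requested
    req_bank       : in  natural range 0 to 3;
    req_row        : in  std_logic_vector(11 downto 0);
    need_precharge : out std_logic;                      -- old row must be closed first
    need_activate  : out std_logic                       -- requested row must be opened
  );
end entity;

architecture rtl of row_tracker is
  type row_array is array (0 to 3) of std_logic_vector(11 downto 0);
  signal open_row  : row_array := (others => (others => '0'));
  signal row_valid : std_logic_vector(3 downto 0) := (others => '0');
begin
  process (clk)
  begin
    if rising_edge(clk) then
      if rst = '1' then
        row_valid      <= (others => '0');
        need_precharge <= '0';
        need_activate  <= '0';
      elsif req_valid = '1' then
        if row_valid(req_bank) = '1' and open_row(req_bank) = req_row then
          need_precharge <= '0';   -- row hit: the burst can be issued right away
          need_activate  <= '0';
        elsif row_valid(req_bank) = '1' then
          need_precharge <= '1';   -- row conflict: close the old row, then open the new one
          need_activate  <= '1';
        else
          need_precharge <= '0';   -- bank idle: just open the requested row
          need_activate  <= '1';
        end if;
        open_row(req_bank)  <= req_row;
        row_valid(req_bank) <= '1';
      end if;
    end if;
  end process;
end architecture;
```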

My goal is to make the SDRAM controller as stable as possible, as it is one of the core blocks of the scaler & maybe I can still shave off some clock cycles here and there to make it faster.
I will work on that next week and then continue with the other blocks. Still missing are the video input and output blocks (kinda defined already from my experiments), the read/write scheduler for the framebuffer (probably the most complex block), the scaler and the block that writes OSD info into the framebuffer. Still a LOT of work ahead!
GSModeSelector can switch from 480i to 480p. Maybe experiment with it?
 
Joined
Dec 25, 2022
Messages
42
Likes
408
Location
Landeck, Austria
GSModeSelector can switch from 480i to 480p. Maybe experiment with it?
I appreciate the suggestion, but I decided not to rely on GSM for the same reason I decided to make miniature memory cards instead of using virtual memory cards - it's not a plug and play solution and needs per-game settings adjustments to get things working (and PAL games in general are pretty hit or miss). I'm lazy, so ideally I would just like to plug in an SD card and start playing - like you would do on a normal PS2.

While I'm typing already, I also have an update to share! :D

I spent the last 6 weeks implementing a first proof of concept for the scaler. After doing the final tests of my SDRAM controller I was thinking about an architecture that is simple, expandable & hopefully functional. I split the design into functional blocks and started implementing & simulating them individually, starting from the video input block. To get things started, only the weave deinterlacer is implemented, so I can find problems before I dive into scaling.

The result of the first implementation can be seen here:


The current proof of concept still has its flaws; I don't know how well they come across in the video. The most obvious issues I observed are:
  • Every now and then the video shows heavy combing artifacts and feels a bit choppy, then returns to a normal state for a while. While I understand that those artifacts are somewhat normal for a weave deinterlacer, sometimes they are so bad that I think there is a bug somewhere. I have my suspicions about where the issue might be and I will need to do some debugging of the whole frame lock logic - it might be that the deinterlacer loses synchronization and a field of one frame ends up in the next frame (which would explain the choppiness and the nasty combing)
  • Depending on the scene, the image has quite a bit of noise. It is especially noticeable in areas with big color/contrast changes (like the text in the OPL menu) or in GTA where there are soft color gradients (very obvious in the second half of the video when I fly the jet). In still images, some pixels also like to flicker, most noticeably red and green. I'm still not quite sure where I should start troubleshooting; it could be caused by a lot of things. My ideas so far are:
    • dithering on the GS output
    • bad connection of one or more lines on either the input or output (actually found 2 already)
    • timing issues either while sampling the video input or inside the deinterlacer
    • driving the LCD incorrectly - I'm operating the LCD in OE-mode instead of HV-mode, which is very much undocumented in the LCD's datasheet (apart from 2 diagrams, one for cascade, one for dual gate - no idea what the difference might be)
    • This was fixed already. The LCD datasheet doesn't list the setup and hold times, and at 25MHz I thought looking into the timings wouldn't be necessary at this point. Turns out playing with the pixel clock phase shift of the output signal fixed the issue. I discovered it when injecting custom video data into the input stream and comparing the input & output - they were identical (and the issue was still present with the injected video signal). For the video signal I chose columns of dark and light gray with fast transitions - the output was displaying the first 2 pixels of every transition incorrectly. That meant the issue could only be due to driving the LCD incorrectly, most likely the clocking.
Other than that I'm quite happy with the first try! This is now the first time the system is truly "portable" and playable, with no wires going to it!
  • The image itself is generally very sharp and the frame lock logic I implemented should also make switching between NTSC and PAL easier
  • There is visible combing, but on such a small display it's not as bad as on a TV
  • Video delay is exactly one frame. This is because of the framebuffer implementation I ended up using: while the current input frame is captured, the output block transmits the last frame, which leaves me a lot of freedom in the way I structure the state machines. In the framebuffer this is done using the bank select of the SDRAM; it allows me to always have 2 rows open at the same time and eliminates the constant precharge commands, which increases bandwidth (small sketch after this list)
  • Each output frame is always transmitted twice per input frame (an input frame consists of the odd and even field), which leads to a 25Hz effective framerate (PAL) and 50Hz in total
  • Without the scaler, PAL video doesn't fit the LCD yet, but I made it easy to adjust the image position
  • I measured the power consumption today and it's about 1.2W in total for the FPGA development board and the LCD at full brightness
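To make the bank-select trick from the "one frame delay" bullet a bit more concrete, here is a stripped-down, hypothetical sketch of the ping-pong selection; the names are invented and in the real design this lives inside the framebuffer logic rather than a separate entity:

```vhdl
library ieee;
use ieee.std_logic_1164.all;

entity frame_swap is
  port (
    clk         : in  std_logic;
    frame_start : in  std_logic;   -- one pulse per input frame
    wr_bank_hi  : out std_logic;   -- SDRAM bank address bit used for writes
    rd_bank_hi  : out std_logic    -- SDRAM bank address bit used for reads
  );
end entity;

architecture rtl of frame_swap is
  signal sel : std_logic := '0';
begin
  process (clk)
  begin
    if rising_edge(clk) then
      if frame_start = '1' then
        sel <= not sel;            -- swap buffers once per input frame
      end if;
    end if;
  end process;

  wr_bank_hi <= sel;               -- capture the current frame into one half
  rd_bank_hi <= not sel;           -- scan out the frame that was finished last
end architecture;
```

With the frame select mapped onto a bank address bit, the write side and the read side always hit different banks, so each can keep its row open without fighting the other.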
IMG_20240218_225645910.jpg
Those 2 big issues I mentioned will need some debugging and that's the next step. I started already and the debugger IP core in the Efinity software helps a lot, but I might still need to hook up a proper logic analyzer to see what is going on. Maybe someone with video processing experience has some clues to reduce the troubleshooting time? I would certainly appreciate it!

Anyway, I'm still very much motivated to continue with the FPGA implementation and let's see where I end up! FPGAs are actually kinda fun and I can highly recommend giving one a try!
 
Last edited:

wisi

.
Joined
Feb 24, 2024
Messages
2
Likes
3
This is an amazing project! I am glad people like the PS2 even after so many years and the projects around it get more and more interesting!
I have been doing small software and hardware projects on the PS2 as a hobby, but completing them is commonly difficult. :D

PS2 models after SCPH-75000 (included) have the IOP replaced with a PowerPC CPU with a MIPS Auxiliary Processing Unit. It runs the DECKARD emulator from the BOOT ROM, which partially emulates the IOP, while the MIPS APU handles some of the instruction execution. The emulator code is in the DECKARD folder (big endian) in the BOOT ROM, at the very top (highest address) of the BOOT ROM. Those BOOT ROMs have 3 /CS signals - MIPS BOOT ROM /CS2, DVD ROM /CS1 and a /CSPPC for the PowerPC. I don't know if /CS2 and /CSPPC access the same memory area or two separate areas, but the data in both is exactly the same. For some unknown reason /CSPPC appears at the connector on the Dev9 controller and SPU page of the SCPH-70000 SM (even though that is not a Deckard model).

To generate the BOOT ROM /CS for the new replacement flash ROM, /CS2 has to be AND-ed with /CSPPC, which went to the original ROM chip. Though it seems that because there are few external devices, and internal devices don't come out onto the SSBUS, even having the BOOT ROM selected whenever /CS1 and /CS5 (CDVD DSP) are not selected works fine.

The SSBUS (IOP sub-system bus to the peripherals) has configuration registers which set the access timings, widths and some other properties of the peripheral channels. It is possible, for example, to make the DVD ROM use an 8-bit bus width or slower/faster timings. Also, AFAIK the DVD ROM by default uses faster timings than the BOOT ROM, and the BOOT ROM initially uses even slower timings which switch to faster ones at a certain point in the boot process, but I could be wrong about this.

Also there is an unused EXTR SSBUSC channel. It is used on the PS2 TOOL for PIF (interface to the PC side for debugging) communication. Its lines (chip select, DMA, interrupt) are present at pads on some SCPH-790xx models, so they might be present on at least some revisions (if not all) of the PS2-on-chip ASIC (but for the GS and CDVD subsystem). That channel can also be used to connect an external device if that is necessary (avoiding the need for using the DVD ROM, though it would have to be remapped somewhere with enough space).

It is really inspiring to read this thread! Thank you for working on this project! :)
 
Joined
Dec 25, 2022
Messages
42
Likes
408
Location
Landeck, Austria
This is an amazing project! I am glad people like the PS2 even after so many years and the projects around it get more and more interesting!
I have been doing small software and hardware projects on the PS2 as a hobby, but completing them is commonly difficult. :D
Thanks! The PS2 was my first console, so I will also always have a soft spot for it! I've been modding mine already as a kid 20 years ago, so it's really fun to do something more serious with it now :)
Also thanks for the info regarding the CS, it's quite hard to find information like that online! I haven't seen /CSPPC yet, could it be one of the pins 12, 13 or 14 on the bios chip? Anyway, handling the CS on the bios flash is quite flexible, we found a couple of ways already that seem to work. Only getting the DVD ROM to function is a bit more complicated so far, as it requires 16 bit data by default. In a portable it's not needed anyway as there is no disc drive, but I could imagine physical games wouldn't work without it(?)

While I'm typing already, I forgot to write an update again....

Video Output

Soooo... on the FPGA side I have found and fixed the issues from the last update.

The noise in the image was actually just a timing issue when driving the screen, as seen in my last edit.
The artifacts and lag every now and then were caused by frame drops in GTA. This is something I didn't think about initially, but now it makes sense. For PAL, the video output is fixed at 50Hz - independent of the ingame framerate. For a 25fps game this means that a frame consisting of 2 fields is output at 25Hz.
Now to the issue: When the game framerate drops below 25fps and the video output is fixed at 25Hz, some fields/frames will need to be sent twice to keep the output stable (most likely applicable to FMV scenes too).
In my first deinterlacer implementation, my assumption was that a frame always starts when HSYNC is low directly after a rising edge of VSYNC. My algorithm would then stitch the following 2 fields together and display them. I think usually this assumption would be correct, but in the case of lag spikes, this cannot be guaranteed. In the worst case, the deinterlacer would then combine fields from 2 completely different frames, which causes the combing artifacts and the additional lag.

My first fix for this was to always stitch 2 subsequent fields together, which also doubles the output framerate from 25Hz to 50Hz. It works OK for games that run at 50fps, but for <25fps games it always generates combing artifacts, because 50% of the time two different frames are combined.

The next implementation was just a simple line doubler, and that was the point where I discovered all this. The line doubler had no issues at all, because it displays the input 1:1, just line doubled. It looks horrible, though....

A couple of iterations of various algorithms later I started feeling the need for a motion adaptive deinterlacer, as it would fix most of my issues with the weave deinterlacer and the quality issues of the line doubler. This is exactly what I did and here is a demonstration of my motion adaptive deinterlacer algorithm:


It's a hybrid between a 2-field and a 3-field motion adaptive deinterlacer. Ideally it would have been a 5-field algorithm, but I don't have the memory bandwidth to pull that off.
The performance is still pretty good: the 3-field part handles most of the motion detection, and in the cases where it misses some motion (fast moving objects, mainly in 50fps games), the 2-field part kicks in and detects it. When motion is detected, I do a simple interpolation between the pixels above and below. When no motion is detected, weave deinterlacing is performed.
Another thing to mention is that the algorithm calculates every missing pixel of a field individually, compared to some implementations that group a bunch of pixels together.
The "calibration" is based on two thresholds, one that sets the allowed motion value and one that inhibits the noisy nature of the 2-field motion detector. I adjusted both to make the result look good in most situations, but it still needs some tweaking.

In some parts of the video I modified the calculations to display movement detected by the 3-field part in pink and the output of the 2-field detector in yellow. It performs exceptionally well with 50fps games, apart from some minor combing present on very fast moving objects & some noise caused by false detections. In GTA SA (~25fps) it looks great too, but there (last quarter of the video) you can start to see the main weakness of the algorithm: during lag spikes, when a field is displayed multiple times, the motion detector cannot detect any motion, which leads to the flickering you see in some scenes. It's really only noticeable when the interpolated pixels are colored pink, but it still bugs me a little bit. Proper motion adaptive deinterlacers have some sort of "decay" function that stores the motion values for each frame and gradually adjusts them according to the current motion value.
Mine doesn't do that right now, but I was thinking about how to best implement it & it might be possible to do it by rewriting parts of the framebuffer scheduler...
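One common way such a decay could work (again just a hypothetical sketch, this is not in my design yet): keep a stored motion value that fresh motion overrides immediately but that otherwise fades out step by step. Per pixel, that stored value would have to live in the framebuffer, which is exactly why the scheduler would need changes.

```vhdl
library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity motion_decay is
  generic (DECAY_STEP : natural := 8);          -- made-up value, would need tuning
  port (
    clk           : in  std_logic;
    pixel_valid   : in  std_logic;
    motion_now    : in  unsigned(7 downto 0);   -- motion measured this frame
    motion_stored : out unsigned(7 downto 0)    -- decayed value fed to the threshold check
  );
end entity;

architecture rtl of motion_decay is
  signal stored : unsigned(7 downto 0) := (others => '0');
begin
  motion_stored <= stored;

  process (clk)
  begin
    if rising_edge(clk) then
      if pixel_valid = '1' then
        if motion_now > stored then
          stored <= motion_now;                 -- fresh motion wins immediately
        elsif stored > DECAY_STEP then
          stored <= stored - DECAY_STEP;        -- otherwise fade out over a few frames
        else
          stored <= (others => '0');
        end if;
      end if;
    end if;
  end process;
end architecture;
```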

I also got two new displays from here and tested them; the display in the video is actually a new one. Overall they look OK - viewing angles are perfect, colors are ok, brightness is good - and they have somewhat decent datasheets. What really bothers me, though, is the dithering they do. When first turning them on, the flickering caused by it is very distracting, but it seems to disappear when the displays have been on for a while. Still not comparable to the outstanding image quality I got with the old waveshare display, but much better than some other displays I've used in the past.

There's still a shitload of work ahead for the video processor:
  • look into implementing a "decay" for the motion values
  • write a (bilinear?) scaler (especially needed for PAL)
  • implement support for 480p and 480i
    • Everything I wrote so far should be natively compatible with NTSC, but I have no way right now to measure all the parameters I need to know in order to make the input state machine lock onto the video stream. For 480p I will need to change the fsm of the deinterlacer to just pass the data through for progressive video
  • the input state machine will need some refactoring, as the new screens are a bit picky when it comes to video dropout during boot or while launching a game. I will need to reset the video processor and the display when the fsm loses synchronisation to the input signal
  • I will still need a module that enables me to draw sprites onto the screen via the syscon (e.g. battery level)
I think with the motion adaptive deinterlacer it's looking pretty good already - that one caused a lot of sleepless nights...The next headache will for sure be the scaler...

Next topic:

To not lose momentum regarding the electrical/mechanical design, I started to work on the new mainboard revision with slightly reduced priority. I already copied the circuits from all my extension boards into the schematics (which was a big pain in the ass - libraries) and started working on a first housing concept to define the board outline and component placement. I would say the schematics are about 80% done and I'm sitting at exactly 30 pages right now. Cross checking this one will definitely take some time and I'm fully expecting this to not work the first time, so I will be adding a lot of debugging/troubleshooting features to be sure. The mainboard will be about the size of a 5 inch LCD, most likely a bit bigger for mounting purposes.
Usually I don't share very preliminary stuff, but here is a very early concept for the mainboard, nothing is routed, the connectors are placeholders, and it will most likely still change a lot:

mainboard_2.PNG


I have a rough drawing of the housing front view, but sadly I cannot share it as solidworks refuses to launch since the last (forced) update on Monday.

In short, this is the idea:
  • 3 or 4 piece construction; the mainboard, display, gamepad and batteries will be mounted in their own subassembly, call it "skeleton", which is sandwiched between the top and bottom housing. This allows for easy debugging, assembly and shell swaps.
  • The holes around the edges of the first mainboard concept are meant to have 2 purposes:
    • mount the mainboard to the display subassembly
    • securely attach and ground the heatspreader/shield I was experimenting with in one of my previous posts
  • The SD card slots will probably be on their own rigid/flex PCB, which attaches to the expansion connector in the top left
  • 2 18650 or 21700 cells; replaceable. The housing will probably have a cell on each side in the grips, which would make the grips ~30mm thick; the center would then be about ~16mm thick
    • 18650 vs. 21700: I did a quick runtime test to see which battery would satisfy my 3h runtime requirement. The test was done with 2 2500mAh 18650 cells from my junk pile (they are from ~2013, left fully discharged for a while) and I managed to reach about 2.5h in GTA SA at full display brightness. This makes me believe that 2 3500mAh 18650s would be sufficient, in case I cannot fit 21700 cells (rough scaling math after this list)
  • For the buttons I'm planning to use PSVita buttons. I actually ordered and received some for testing. I'm considering these tact switches for the dpad and action buttons, after trying all variants I concluded that I prefer the 100gf variant. Start, select and power will most likely use something like this. The shoulder buttons and triggers are not decided yet
  • The analog sticks will for sure be hall effect switch sticks, probably in the XBOX arrangement. I got myself these and I quite like the look and feel compared to the Gulikit sticks
  • For cooling I'm planning to use a switch lite fan, controlled via PWM
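Rough scaling math from that runtime test: 2.5h with two worn-out 2500mAh cells scales to roughly 2.5h x 3500/2500 ≈ 3.5h with two 3500mAh 18650s, comfortably above the 3h target - assuming the load stays the same and ignoring that fresh cells should actually hold up a bit better than 10-year-old ones.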

Overall the progress slowed down a little due to the video processor implementation, but I'm having a lot of fun doing it :) It's good to do some electrical/mechanical design again though; I already started accidentally writing "if() then" in my C code recently.... too much VHDL
 

wisi

.
Joined
Feb 24, 2024
Messages
2
Likes
3
/CSPPC which is also /CS13 in some way, seems to go to pin 12 of the BOOT ROM (SCPH-790xx).
Pins 13, 14 seem to both be connected to A22, according to some measurements of mine from many years back (I could be wrong about this - I wasn't too sure about it back then either).

The DVD ROM contains the ROM filesystems rom1, rom2 and erom; only the DVD Player (for playing DVD-Video discs) and the Chinese fonts (SCPH-50009) are on it, from what I could find out. You may want to ask about such questions on the PS2 Scene discord or psx-place.com. The DVD Player is used in some disc exploits, but given that you don't have a CDVD drive, it should be mostly ok to remove (it shouldn't be needed by any games, but best to test at least a few or ask somebody who knows for sure).
 

thedrew

.
.
Joined
Sep 27, 2016
Messages
454
Likes
1,028
Insane man, really impressive and inspiring work as usual! That digital video is looking CRISPY.

I have those tact switches (100gf) ones installed in a couple of my portables and they are the perfect feel in my opinion, good choice.

Sounds like you're satisfied with the viewing angles of your screen but not the colors. That's the problem I've had with these small IPS panels UNTIL I found a couple of small QLED panels, one 4.3" and one 5", both having the same resolution of 800x480. Maybe I can suggest looking into the 5" QLED panel? Just know, once you go QLED, you can't go back... everything else just looks like trash lol.

https://www.elecrow.com/5-inch-qled-quantum-dot-display-800-x-480-resistive-touch-screen.html

I have that 5" QLED screen installed in my latest portable and it is gorgeous. There's a little bit of work involved since the screen ships with a TN panel, but an IPS LCD can be transferred in, since all we care about is retaining the QLED filters. I go over it, as well as some example shots compared to a standard Waveshare 4.3" IPS screen, in this thread if you're interested in scrolling through it:

https://bitbuilt.net/forums/index.php?threads/priimiium-v1.5964/

The FPGA video + the QLED screen would be the end-all be-all display solution for the PS2, and who else would be the best candidate for this!?
 
Last edited:
Joined
Dec 25, 2022
Messages
42
Likes
408
Location
Landeck, Austria
/CSPPC which is also /CS13 in some way, seems to go to pin 12 of the BOOT ROM (SCPH-790xx).
Pins 13,14 seem to both be connected to A22, according to some measurements of mine from many years back (I could be wrong about this - I wasn't too sure about it back then too).
That would finally solve the mystery around pins 12, 13 and 14! One day someone should really test that bios mod with a disc drive, it's completely unknown to me what would happen in that case. With OPL it works great so far (for more than a year now) - and I've played a lot of games on it, for far longer than I dare to admit :D
So for the next revision I will probably not change it, but I do have quite a few questions - it's just a shame that such information is not easily available and will probably be lost with time.

Insane man, really impressive and inspiring work as usual! That digital video is looking CRISPY.

I have those tact switches (100gf) ones installed in a couple of my portables and they are the perfect feel in my opinion, good choice.

Sounds like you're satisfied with the viewing angles of your screen but not the colors. That's the problem I've had with these small IPS panels UNTIL I found a couple of small QLED panels, one 4.3" and one 5", both having the same resolution of 800x480. Maybe I can suggest looking into the 5" QLED panel? Just know, once you go QLED, you can't go back... everything else just looks like trash lol.

https://www.elecrow.com/5-inch-qled-quantum-dot-display-800-x-480-resistive-touch-screen.html

I have that 5" QLED screen installed in my latest portable and it is gorgeous. There's a little bit of work involved since the screen ships with a TN panel, but an IPS LCD can be transferred in, since all we care about is retaining the QLED filters. I go over it, as well as some example shots compared to a standard Waveshare 4.3" IPS screen, in this thread if you're interested in scrolling through it:

https://bitbuilt.net/forums/index.php?threads/priimiium-v1.5964/

The FPGA video + the QLED screen would be the end-all be-all display solution for the PS2, and who else would be the best candidate for this!?
Thanks for the suggestion!
I would be surprised if the QLED backlight & filters can fix the horrible dithering of these LCDs and the mod doesn't sound like a very economical solution at this point - but I'm willing to give it a shot!
If you can confirm that this is the same model, I will order one. Anyway I ordered 2 LCDs, so I can sacrifice one for science :D
I may also order another one of those waveshare 5" DPI LCDs, as the image quality of the one I have is much better than the new screens. However, it's a couple years old and I don't know whether they use different panels now. Sadly mine has a broken row of pixels after a little accident...
 

thedrew

.
.
Joined
Sep 27, 2016
Messages
454
Likes
1,028
Yes you're right, the AliExpress link you posted is the same seller as the link I posted, just on AliExpress. It also won't solve the dithering issue you mentioned, but the colors would be popping, as well as a slight increase in contrast!
 
Joined
Apr 12, 2017
Messages
221
Likes
147
Location
Canada
Lord have mercy!! To think we used to consider the PS2 such a black box, and then you come along and just blow all its secrets wide open! One wonders how one man can possess such keen insight.

Do these BIOS mods you've been working on have the ability to give a 70000 HD boot capability? Someone put a 50000 BIOS on a 70000, and despite a few bugs, it apparently gained the ability to use FHDB. Just wondering, since you would seem to be exactly the right person to ask!

Anyway, I am VERY impressed with your direct LCD interface. I always wondered if that was possible. Which is to say nothing of all the other amazing things you've already done! This is easily the most interesting thread I've ever seen. Awaiting eagerly your every word.
 
Joined
Dec 25, 2022
Messages
42
Likes
408
Location
Landeck, Austria
Thanks I guess! And sorry for the late reply, I was quite busy with work the last couple of weeks!

Regarding your question, I am pretty confident that the hardware-side of the bios mod would work fine on all models with the custom 1-chip OTP rom. The software side could be another story, but if someone got a 50k bios running on a 70k console it sounds feasible at least. I know from experience that it can be hit or miss, for example we tried a lot of bios revisions on my mainboard and older versions from 75k or 77k consoles wouldn't boot at all, if I remember correctly. For details about the bios itself, @Epaminondas is the expert, I think he spent a lot of time modifying it.

It's also time for a small update!

I didn't have too much time recently, but I managed to make some progress on the FPGA and mcad/ecad. Let's start with the FPGA:

In order to prepare for the automatic resolution detection I did quite some refactoring on all modules and also implemented a rough first version of the scaler module. The scaler doesn't do any proper scaling right now, but the datapath is there and the state machine already includes all required states to implement it easily. So far it can only do nearest neighbor downscaling in the vertical axis to make my PAL signal fit the 480p screen. I figured nearest neighbor scaling would look better than bilinear for scaling 512p to 480p, as the difference is quite small (for everything else my go-to would still be bilinear, of course). To pull it off, I'm essentially scrapping every 16th line of pixels for PAL. After playing for a bit it's almost unnoticeable in games, but the text in OPL looks weird in some places. NTSC has no scaling applied right now, as it fits the screen natively.
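The line dropping itself is trivial - conceptually it's just a modulo-16 line counter (512 x 15/16 = 480). Here is a hypothetical sketch with invented names; in the real design this decision happens inside the scaler's state machine rather than a separate entity:

```vhdl
library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity line_drop is
  port (
    clk         : in  std_logic;
    line_start  : in  std_logic;   -- one pulse at the start of each input line
    frame_start : in  std_logic;   -- one pulse at the start of each input frame
    keep_line   : out std_logic    -- '0' on every 16th line -> don't forward it
  );
end entity;

architecture rtl of line_drop is
  signal cnt : unsigned(3 downto 0) := (others => '0');
begin
  keep_line <= '0' when cnt = 15 else '1';

  process (clk)
  begin
    if rising_edge(clk) then
      if frame_start = '1' then
        cnt <= (others => '0');
      elsif line_start = '1' then
        cnt <= cnt + 1;            -- wraps 0..15, so exactly 1 of every 16 lines is dropped
      end if;
    end if;
  end process;
end architecture;
```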

Below is a video showing how seamless the resolution switching is:


Switching between 50 and 60Hz was actually the most difficult part; adapting to the new resolution just worked right away (as everything was prepared for it). For 60Hz I had to adjust the LCD timings, as the old parameters were only able to achieve ~57Hz. Now it works quite well: the video input module counts the vertical resolution of every field and schedules a reset in case the resolution of the current field differs from the previous field. The whole scaler then resets for the duration of one field and takes another field to resynchronize with the video input. So overall, I'm expecting about 3 fields of delay when the input resolution changes (~60ms).
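The detection part boils down to counting lines per field and comparing against the previous field. A rough sketch of the idea (names invented, and a real implementation would probably also want to filter out one-off glitches before requesting a reset):

```vhdl
library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity field_watch is
  port (
    clk        : in  std_logic;
    line_pulse : in  std_logic;   -- one pulse per incoming video line
    field_end  : in  std_logic;   -- one pulse at the end of each field (vsync)
    resync_req : out std_logic    -- pulse: field height changed, reset the scaler
  );
end entity;

architecture rtl of field_watch is
  signal lines_now  : unsigned(9 downto 0) := (others => '0');
  signal lines_prev : unsigned(9 downto 0) := (others => '0');
begin
  process (clk)
  begin
    if rising_edge(clk) then
      resync_req <= '0';
      if field_end = '1' then
        if lines_now /= lines_prev then
          resync_req <= '1';      -- height changed -> schedule a reset for one field
        end if;
        lines_prev <= lines_now;
        lines_now  <= (others => '0');
      elsif line_pulse = '1' then
        lines_now <= lines_now + 1;
      end if;
    end if;
  end process;
end architecture;
```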

I was also looking into getting 480p working, but here we have a little roadblock: the console "crashes" when switching to 480p (or any resolution apart from 448i, 512i, 480i, 576i). It would just try to communicate with the video dac forever and get stuck at whatever it is doing (very obvious when looking at the I2C to the video dac). If I remember correctly, Epaminondas had the same issue after removing the original video dac and he fixed it by connecting an STM32 as an I2C slave. Apparently the PS2 is happy as long as it gets an acknowledge for everything it sends.
This means I will have to implement an I2C slave in VHDL with slave address 0x21 (that's the address of the video dac) and do some experiments to get the progressive output working. Anyway, the FPGA is already connected to this I2C bus (almost as if I was expecting issues...)
Overall, for 448i and 512i the video processor works great already, I even managed to play through most of Rayman 3 using the NTSC mode (I know, I should work more and play less...). The datapath is now mostly complete, with only the sprite module for displaying status information missing.

Last week, during my after-work beer, I was also thinking about the deinterlacer again and I might have a plan now to improve it further. With some changes to the function that packs/unpacks the pixel data into 8-word r/w bursts for the SDRAM, it should be possible to increase the effective framebuffer bandwidth by about 20%. Then I realized that I have enough space to store 2 full frames per bank, which means I could implement a 4 or 5 field deinterlacer, supported by the higher bandwidth (no precharges). With that I could ditch the error-prone 2+3 field deinterlacer and get a much better motion detection with about the same complexity. As the current solution already works quite well though, this will be something to implement in the future (after most other modules are working)

So enough boring FPGA stuff, I was working on the housing design too, to finally define a good mainboard outline and component positions.

A couple of concepts, shapes and iterations later, I am at a point where I have a preferred design, apart from a couple of rough corners. Usually I take the easy route for my hobby stuff, but as I already spent so much time and effort on the electronics I decided to do it properly this time and try some new approaches. Even though I designed a couple of injection molded housings already I wouldn't call myself a good mechanical designer, especially when it comes to organic surfaces/surface modeling. Which means surface modeling it is! (aka. torture time)
It was clear from the very beginning that I want the housing to be as comfortable as possible, which is why, in the end, I tried something I've never done before... After estimating the size and thickness I continued by modeling a housing concept in clay (or at least tried :D), to perfectly fit my hands while achieving the design goals. Recently I had the chance to give the Ayaneo Air a try and I quite liked the ergonomics, so that was a big inspiration for the overall shape and size.

IMG_20240330_192442884-1.jpg
IMG_20240330_192521565-1.jpg


Afterwards I took pictures of it and imported them in solidworks with the correct scale. Then it was just a matter of time to draw all the surfaces, stitch them together and form a solid body to work with. After a couple more iterations and changes I ended up with the design below. It might not be the most beautiful or creative design, but the first test print is quite comfortable to hold and surprisingly small!


housing 1.PNG
housing 2.PNG

housing 3.PNG
housing 4.PNG


housing 5.PNG
IMG_20240430_214007743-1.jpg


Right now I'm roughly defining the inner subassembly, triggers, shoulder buttons, battery holders and cooling. The connector placement is quite defined already, the SD card slots will be on the upper side next to the cooling slots. The headphone jack, USB A and USB C will be on the lower edge (with the USB C centered). The speakers are still a bit open, I'm thinking about having them at the edge below the screen to keep them as far away as possible from the hall sticks (just to avoid nasty surprises). Cooling will probably be centered too, blowing the air out on the top... You see, regarding the physical placement of features I decided to go the corporate way this time, favoring housing aesthetics over (electrical) routing simplicity. Will not make my ecad life easier, but I think it's worth it.


One last thing I just remembered is the QLED screen I received!
Based on the info @thedrew gave in some of his posts, I took it apart and swapped the LCD. That was actually harder than expected! First I only swapped the LCD itself, but the IPS LCD is a bit too small for the plastic frame inside the QLED screen. I tried it anyway and the screen still had very weird viewing angles + it was impossible to center the LCD.
In the end I just swapped out the LED backlight in the IPS for the blue LED "strip" of the QLED and added the quantum dot film in front of the diffuser. So I scrapped the whole QLED including its polarizers, except for the LEDs and the quantum dot film. One mechanical issue I had was that the thick diffuser wouldn't fit anymore when adding the blue backlight, as the LEDs were placed differently. I had to cut off the thin lip on the top edge of the plastic frame with a scalpel to make room for the diffuser again; after that it was possible to assemble the screen. At first the backlight was a bit dark, so I increased the max. backlight current by ~5mA (very conservative, as I don't have a datasheet for the backlight), after which it was comparable to the IPS again.

Overall the modded display looks quite good, there is definitely a difference regarding the colors. I found that it works great for "cartoonish" games like ratchet&clank and rayman, but in some other games the colors do look a bit unnatural from time to time. To conclude: On one hand I will definitely keep using it (as it looks nice), but on the other hand I don't think the improved colors justify spending 80 bucks and an hour of work to modify the display. Of course, this is very subjective!

Sadly I forgot to take pictures of the process!
 
Last edited:
Joined
Apr 12, 2017
Messages
221
Likes
147
Location
Canada
Your case design looks just fine dude! It's worthwhile to remember that there are only so many ways to design a portable, especially with how many commercially designed devices you can get now. Frankly I think the most minimalist and functional devices look the best. There's something very pleasing about the symmetrical straight lines and uniform curves of, say, a Nintendo Switch. That being said, your idea for the trigger button caps being tied into the top plate looks great.

Highly informative tip concerning the use of a "dummy" device on the video DAC I2C line though; with so many of us planning to use alternate chips now, that's one point of failure we can eliminate! What do you suppose is the smallest standalone solution? A PIC16 can communicate over I2C..

Anyway, one thing I've been planning on doing is using wide-screen cheat codes through OPL to change the screen resolution of games that are supported (of which there are a lot). One thing though is they change the resolution to 16:9, which in NTSC is 480×854 (I know you're using PAL but for this example it doesn't make any odds), whereas these screens we use are 480×800. Do you think these wide-screen cheat codes could be customized for the 15:9 screens we use? Either way, the games being output at a wider resolution in the parallel RGB888 data would definitely make them look less distorted. Imagine if we could force the games to be 1:1 for every pixel on the screen...
 

thedrew

.
.
Joined
Sep 27, 2016
Messages
454
Likes
1,028
Awesome work!!! Yeah I agree, bit of a pain to swap the QLED backlight to an IPS screen. I think the easier option would probably be to 3D print a spacer to compensate for the size difference of the panels. But it is pretty expensive to have to buy a QLED screen AND an IPS screen. Loving this project, it just keeps getting better and better! Would the FPGA display driver be open sourced at some point?
 

StonedEdge

a.k.a. ClonedEdge
.
.
Joined
Nov 16, 2018
Messages
384
Likes
1,551
Location
Japan, Tokyo
Portables
2
"Small update..." - your updates are generally bigger than most of us on these forums lol :D

Case design looks fantastic. Definitely looks similar to the Aya Neo Air (proud owner of one of those) - it's not as ergonomic as you would hope, but it's tough with a small design to make anything really comfortable... my pinkies don't quite comfortably go anywhere, maybe due to the small size and me having big hands.

Can't wait to see everything assembled!
 
Joined
Dec 25, 2022
Messages
42
Likes
408
Location
Landeck, Austria
Highly informative tip concerning the use of a "dummy" device on the video DAC I2C line though; with so many of us planning to use alternate chips now, that's one point of failure we can eliminate! What do you suppose is the smallest standalone solution? A PIC16 can communicate over I2C..

Anyway, one thing I've been planning on doing is using wide-screen cheat codes through OPL to change the screen resolution of games that are supported (of which there are a lot). One thing though is they change the resolution to 16:9, which in NTSC is 480×854 (I know you're using PAL but for this example it doesn't make any odds), whereas these screens we use are 480×800. Do you think these wide-screen cheat codes could be customized for the 15:9 screens we use? Either way, the games being output at a wider resolution in the parallel RGB888 data would definitely make them look less distorted. Imagine if we could force the games to be 1:1 for every pixel on the screen...
Anything that is capable of I2C and has a 3.3V logic level should do the job. I can also confirm now that this actually works! Yesterday evening I wrote a simple I2C slave in VHDL to test it and now it doesn't crash anymore when changing to progressive! The module would be capable of storing the writes and returning data to the PS2, but it's not necessary at all. Right now it just acknowledges its address & any writes to the slave, and this seems to be enough for the console to be happy. Finally I can continue with implementing the progressive mode in the deinterlacer....
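In case anyone wants to replicate the FPGA variant of the "dummy dac": the whole thing boils down to a small state machine that watches SCL/SDA, matches address 0x21 and pulls SDA low during the ACK bit. Below is a rough, simplified sketch (no clock stretching, no read/returned data, minimal input synchronization, invented names) rather than my exact module:

```vhdl
library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity i2c_ack_dummy is
  generic (SLAVE_ADDR : std_logic_vector(6 downto 0) := "0100001");  -- 0x21
  port (
    clk : in    std_logic;   -- system clock, much faster than SCL
    scl : in    std_logic;
    sda : inout std_logic    -- open drain, external pull-up keeps it high when released
  );
end entity;

architecture rtl of i2c_ack_dummy is
  type state_t is (IDLE, RX_BYTE, ACK_BIT);
  signal state      : state_t := IDLE;
  signal scl_d      : std_logic_vector(1 downto 0) := "11";  -- old/new samples of SCL
  signal sda_d      : std_logic_vector(1 downto 0) := "11";  -- old/new samples of SDA
  signal shift_reg  : std_logic_vector(7 downto 0) := (others => '0');
  signal bit_cnt    : unsigned(3 downto 0) := (others => '0');
  signal first_byte : std_logic := '0';
  signal addressed  : std_logic := '0';
  signal drive_low  : std_logic := '0';
begin
  -- open-drain behaviour: only ever pull low, never drive high
  sda <= '0' when drive_low = '1' else 'Z';

  process (clk)
    variable scl_rise, scl_fall, start_cond, stop_cond : boolean;
  begin
    if rising_edge(clk) then
      -- keep a 2-deep history for edge detection (a real design adds proper synchronizers)
      scl_d <= scl_d(0) & scl;
      sda_d <= sda_d(0) & sda;

      scl_rise   := scl_d = "01";
      scl_fall   := scl_d = "10";
      start_cond := (sda_d = "10") and (scl_d(0) = '1');  -- SDA falls while SCL is high
      stop_cond  := (sda_d = "01") and (scl_d(0) = '1');  -- SDA rises while SCL is high

      if start_cond then                    -- (repeated) START: expect a fresh address byte
        state      <= RX_BYTE;
        bit_cnt    <= (others => '0');
        first_byte <= '1';
        drive_low  <= '0';
      elsif stop_cond then
        state     <= IDLE;
        drive_low <= '0';
      else
        case state is
          when IDLE =>
            drive_low <= '0';

          when RX_BYTE =>
            if scl_rise then                -- sample one data bit per SCL rising edge
              shift_reg <= shift_reg(6 downto 0) & sda_d(0);
              bit_cnt   <= bit_cnt + 1;
            elsif scl_fall and (bit_cnt = 8) then
              -- full byte received: decide whether to ACK during the 9th clock
              if first_byte = '1' then
                if shift_reg(7 downto 1) = SLAVE_ADDR then
                  addressed <= '1';
                  drive_low <= '1';         -- ACK our own address
                else
                  addressed <= '0';
                end if;
              elsif addressed = '1' then
                drive_low <= '1';           -- blindly ACK every data byte that follows
              end if;
              state <= ACK_BIT;
            end if;

          when ACK_BIT =>
            if scl_fall then                -- ACK clock finished: release the bus
              drive_low  <= '0';
              bit_cnt    <= (others => '0');
              first_byte <= '0';
              state      <= RX_BYTE;
            end if;
        end case;
      end if;
    end if;
  end process;
end architecture;
```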

Regarding your other question I don't really have an answer. I was looking into this cheat code topic a little bit last year, but it seemed like a real pain to do with PAL games, so I didn't really investigate further

Would the FPGA display driver be open sourced at some point?
In the last couple of months I was actually considering putting my whole project folder on github for version control; then it would always be available for everyone. But I've never really worked with git, so before I mess it up I will have to learn how to properly make use of it... which is way less motivating than working on the project - and that's why today I still keep telling myself to learn git :D
So yes, the plan is to make everything open source!

"Small update..." - your updates are generally bigger than most of us on these forums lol :D

Case design looks fantastic. Definitely looks similar to the Aya Neo Air (proud owner of one of those) - it's not as ergonomic as you would hope, but it's tough with a small design to make anything really comfortable... my pinkies don't quite comfortably go anywhere, maybe due to the small size and me having big hands.

Can't wait to see everything assembled!
Thanks! The grips would have been my only complaint too, I found them way too small for comfortably holding the unit! Making them much bigger really helped with the ergonomics. The pinkie issue is still a problem though, but it's probably hard to fix for such a small device. Maybe I'll experiment with extending them on the bottom, might even make space for 21700 cells!

EDIT: 448p and 480p inputs are now also implemented! The most difficult thing was measuring the video timings to enter into the video input module. The timings for 448p and 480p are:
  • pixel clock: 54MHz
  • vsync: 59.94Hz; low duration = 6hsync
  • hsync: 525 hsync/frame; low duration = 128clk
  • pixels: 1716 clock cycles per line, which implies that the pixels are output at 27MHz -> 858 pixels per line in total (subtracting hsync it's about 794 visible pixels max. - quick sanity check below)
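Quick sanity check on those numbers: 54MHz / 1716 clocks per line ≈ 31.47kHz line rate, and 31.47kHz / 525 lines ≈ 59.94Hz, which matches the measured vsync. 1716 / 2 = 858 pixel periods at 27MHz, and the 128-clock hsync corresponds to 64 pixel periods, so 858 - 64 = 794 visible pixels, matching the estimate above.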
My deinterlacer needed two additional states to add support for progressive. The implementation is more of a hack right now, as the progressive mode reuses parts of the logic from the deinterlacer, which makes it quite slow. The supported resolutions right now are 512i, 448i, 576i, 480i, 448p and 480p, and switching between them still works perfectly! The video position is still all over the place for the different resolutions, but that will be handled by the scaler module in the future.
Next I will try to implement progressive properly (now that I know it works quite well), that should shave off a lot of overhead.
 
Last edited:
Joined
Dec 25, 2022
Messages
42
Likes
408
Location
Landeck, Austria
Long time no update!
I've been busy refining the housing concept and didn't want to post anything until there was some notable progress to show. Now I guess is the time...

Maybe first of all let's get the FPGA topic out of the way.
Defining all the details of the mechanical design is very time-consuming, which took a lot of my resources away from the video processor (which itself takes a lot of time!). As I'm confident that the video processor will work from what I did so far, I decided to prioritize the next hardware revision now and then later optimize the FPGA part together with the syscon code (also because they need to interface with each other for the final implementation). Since the last update I did implement progressive mode, so there are now quite a few video modes supported (marked in green below).

The biggest issue I'm facing is the fact that counting the vertical resolution is not a good way to find out the video input resolution. It's because some PS2 video modes have identical timings, but with different active areas (448i->480i, 512i->576i for example). This makes automatic resolution switching very difficult and I couldn't find a good way yet to manage it without some manual interaction. My first idea was to extract the video resolution information from the I2C to the video encoder, but after recording all video timings OPL supports I was rather disappointed to see that it's completely useless. Here are the results, maybe they are at least helpful to someone:
video_timings.PNG


Interesting observations:
- There are some cases where the resolution can be identified by observing the values sent to registers 90h and 91h (see 480p, 720p, 1080i)
- GCONT toggles when changing the video output mode in the system config.
- There is no I2C communication when switching from PAL to NTSC, this is only handled using the NTPAL pin
- The NTPAL pin is 1 when the console outputs progressive PAL (would expect the opposite)
- At boot there is not a lot going on, the first 2 bytes might be some device id and the bytes written to B1h, 33h and 32h are just 00.
Note: the same registers are accessed repeatedly in different sequences (but same values) until booting is done

Next topic: housing!
The last comment from @StonedEdge got me thinking about the "pinkie issue" every time I touched the first mockup I made (could not unsee that post :D). Got some more feedback from my "test subjects" and ultimately decided to improve upon the existing concept!
This is the current state:
housing concept 3_1.PNG
housing concept 3_4.PNG

housing concept 3_3.PNG
housing concept 3_2.PNG

IMG_20240610_205348193.jpg


It's definitely bigger than the old concept, but I think I still managed to achieve a good compromise between dimensions and ergonomics. It took way more iterations than I dare to admit, mad respects at this point to anyone designing gamepads commercially! This is a seriously difficult task, maybe even the most difficult part of the project yet. The biggest headache was that there is no straight surface apart from the front & the grips are angled in 2 dimensions. I have to say that the clay method did make it a lot easier to modify the design for each iteration, but having access to a 3D scanner would have been really sweet here!

As I said this is the current state, but everyone I asked so far agreed that it feels good, so if there is no major issue with it I tend to not change the outer design too much (might still add or remove some minor features). The inner design is also pretty defined now: I know how many parts there will be, roughly how they look, how they connect and how the device will be assembled. More about that when the time has come...
Right now I'm working on designing rough drafts for every component to check the fitment before continuing to refine everything. The bumpers feel nice too after 2 iterations and will use the KSC222J LFS tact switches, which I found have a nice feedback while still being a bit squishy. The triggers are still open. I was leaning towards the same switches, but couldn't find a configuration that feels "right" yet. @Lightning recommended some rubber membrane switches which I will try, and maybe then the triggers are acceptable too.
I was also working on the battery compartment, which will most likely use Keystone 209 battery contacts paired with 21700 cells (yes, in the new concept they finally fit!). As with the rest, the design is tricky due to how the grips are angled, but I'm having fun :D
Depending on the budget, I may or may not need to run some wires for the batteries (or have a full flex PCB ready), but my focus is on avoiding soldering as much as possible in the assembly of the unit. Here is a quick little (preliminary) exploded view of the existing draft components:

housing concept 3_5.PNG


Next would be the mainboard
At 121x83mm the mainboard is certainly not as small as I would have liked it to be, but keep in mind this plugs directly into the 5" LCD and is essentially the same size. The only additional components needed to get it running would be the left and right button PCBs & a battery, that's it. The SD memory card slots will be on a separate flex PCB and sit next to the heatsink on top of the mainboard.
Component selection & placement are mostly done, I will use mid-mount sockets for audio, USB A and USB C to limit the overall thickness. For the USB A this was actually tricky, because there was literally one option available which satisfies my requirements of it being mid-mount and having no lip around the outer edge. Does this ring some bells? Yes, the stock mainboard actually has just the right USB A socket, so I will reuse it.
Right now I'm working on doing a new routing iteration to see whether all connections and placements would still be feasible on 6 layers (starting with the BGA fanouts):

mainboard_3.PNG
mainboard_4.PNG


Still open are the speakers. The placement is defined and they will be front facing; for that I was considering using either my current ones or some switch oled speakers, but I couldn't find the dimensions anywhere to check the fitment in my assembly. The switch oled speakers are ordered, but I'm still having issues finding the part number for the connectors they use. Looking at the pictures I would say it's a JST BM02B-SURS style connector, but I would need to order some to check. Maybe someone knows?

Another thing to note about the mainboard: There will still be no bios flash directly on the board, I will use a similar module to my old design. The reason is that I still see a lot of potential in bios customizations and I would also like to install the PS2 SDK to experiment with a custom version of PS2BBL at some point. I will also expose all signals that run directly from the mechacon to the EE on the bios connector to maybe do some feasibility regarding removing the mechacon and DSP. @Epaminondas already proved this is possible, so it would be quite interesting to try (certainly not for this portable, but maybe rev. 2.0?).

Technically not part of the project, but related:
For the first mainboard revision I had problems with soldering the EE, as it's quite large and when heated unevenly the substrate tends to warp. I suspect this is why the first try didn't work, but on my current mainboard I saw that the EE substrate is warped too (the corners are closer to the PCB than the centers). In theory this could be ok, as the silicon is wire bonded to the substrate, which might allow for some more stress (took one apart to check...). Anyway, to make the assembly more reproducible in the future I was slowly designing a reflow oven in parallel to the project over the last year and I'm pushing hard to make it ready for the next mainboard revision!
It's based on the Controleo 3 and fully custom designed to fit a PS2 mainboard (+ a little bit extra). I have no experience with reflow ovens, apart from what I hear from our suppliers at work, so this could be fun to set up and work with. In a week or 2 I should get the last components and be able to start it up the first time to gain some experience with it!

IMG_20240602_210737361.jpg


To sum it up: slow progress, but I'm getting there! It used to be faster in the past when I could fully focus on the electronics, but now there are a lot of components that need attention!
 

Y2K

"The PS1 Guy"
Staff member
.
.
Joined
Apr 14, 2022
Messages
181
Likes
400
Location
Chicago, IL
Long time no update!
I've been busy refining the housing concept and didn't want to post anything until there was some notable progress to show. Now I guess is the time...

Maybe first of all let's get the FPGA topic out of the way.
Defining all the details of the mechanical design is very time-consuming, which took a lot of my resources away from the video processor (which itself takes a lot of time!). As I'm confident that the video processor will work from what I did so far, I decided to prioritize the next hardware revision now and then later optimize the FPGA part together with the syscon code (also because they need to interface with each other for the final implementation). Since the last update I did implement progressive mode, so there are now quite a few video modes supported (marked in green below).

The biggest issue I'm facing is the fact that counting the vertical resolution is not a good way to find out the video input resolution. It's because some PS2 video modes have identical timings, but with different active areas (448i->480i, 512i->576i for example). This makes automatic resolution switching very difficult and I couldn't find a good way yet to manage it without some manual interaction. My first idea was to extract the video resolution information from the I2C to the video encoder, but after recording all video timings OPL supports I was rather disappointed to see that it's completely useless. Here are the results, maybe they are at least helpful to someone:
View attachment 33742

Interesting observations:
- There are some cases where the resolution can be identified by observing the values sent to registers 90h and 91h (see 480p, 720p, 1080i)
- GCONT toggles when changing the video output mode in the system config.
- There is no I2C communication when switching from PAL to NTSC, this is only handled using the NTPAL pin
- The NTPAL pin is 1 when the console outputs progressive PAL (would expect the opposite)
- At boot there is not a lot going on, the first 2 bytes might be some device id and the bytes written to B1h, 33h and 32h are just 00.
Note: the same registers are accessed repeatedly in different sequences (but same values) until booting is done
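To make the timing ambiguity above a bit more concrete, here is a rough sketch (plain C just for readability, with made-up names; this is not my actual implementation) of what classification purely from the measured sync timing can and cannot tell apart. The line counts are the standard totals per frame for each mode:

```c
/* Rough illustration only -- the enum and function names are made up,
 * the line counts are the standard totals per frame for each mode. */
#include <stdint.h>
#include <stdbool.h>

typedef enum {
    MODE_480P, MODE_720P, MODE_1080I,
    MODE_448I_OR_480I,   /* same 525-line NTSC timing, different active area */
    MODE_512I_OR_576I,   /* same 625-line PAL timing, different active area  */
    MODE_UNKNOWN
} video_mode_t;

/* Classify purely from the measured sync timing. */
video_mode_t classify_by_timing(uint16_t lines_per_frame, bool interlaced)
{
    if (!interlaced) {
        if (lines_per_frame == 525)  return MODE_480P;
        if (lines_per_frame == 750)  return MODE_720P;
    } else {
        if (lines_per_frame == 1125) return MODE_1080I;
        if (lines_per_frame == 525)  return MODE_448I_OR_480I;  /* ambiguous */
        if (lines_per_frame == 625)  return MODE_512I_OR_576I;  /* ambiguous */
    }
    return MODE_UNKNOWN;
}
```

The two *_OR_* cases are exactly the ones where the I2C capture doesn't help either, so for now those would still need some manual selection.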

Next topic: housing!
The last comment from @StonedEdge got me thinking about the "pinkie issue" every time I touched the first mockup I made (could not unsee that post :D). I got some more feedback from my "test subjects" and ultimately decided to improve upon the existing concept!
This is the current state:
View attachment 33747View attachment 33746
View attachment 33745View attachment 33744
View attachment 33756

It's definitely bigger than the old concept, but I think I still managed to achieve a good compromise between dimensions and ergonomics. It took way more iterations than I dare to admit; mad respect at this point to anyone designing gamepads commercially! This is a seriously difficult task, maybe even the most difficult part of the project yet. The biggest headache was that there is no flat surface apart from the front and the grips are angled in two dimensions. I have to say that the clay method did make it a lot easier to modify the design for each iteration, but having access to a 3D scanner would have been really sweet here!

As I said this is the current state, but everyone I asked so far agreed that it feels good, so unless a major issue comes up I don't plan to change the outer design much (might still add or remove some minor features). The inner design is also pretty well defined now: I know how many parts there will be, roughly how they look, how they connect and how the device will be assembled. More about that when the time has come...
Right now I'm working on rough drafts of every component to check the fitment before continuing to refine everything. The bumpers feel nice too after 2 iterations and will use the KSC222J LFS tact switches, which I found have a nice feedback while still being a bit squishy. The triggers are still open. I was leaning towards the same switches, but couldn't find a configuration that feels "right" yet. @Lightning recommended some rubber membrane switches which I will try, and maybe then the triggers will be acceptable too.
I was also working on the battery compartment, which will most likely use Keystone 209 battery contacts paired with 21700 cells (yes, in the new concept they finally fit!). As with the rest, the design is tricky due to how the grips are angled, but I'm having fun :D
Depending on the budget, I may or may not need to run some wires for the batteries (or have a full flex PCB ready), but my focus is on avoiding soldering as much as possible in the assembly of the unit. Here is a quick little (preliminary) exploded view of the existing draft components:

View attachment 33759

Next would be the mainboard:
At 121x83mm the mainboard is certainly not as small as I would have liked it to be, but keep in mind this plugs directly into the 5" LCD and is essentially the same size. The only additional components needed to get it running would be the left and right button PCBs & a battery, that's it. The SD memory card slots will be on a separate flex PCB and sit next to the heatsink on top of the mainboard.
Component selection & placement are mostly done; I will use mid-mount sockets for audio, USB A and USB C to limit the overall thickness. For the USB A this was actually tricky, because there was literally one option available that satisfied my requirements of being mid-mount and having no lip around the outer edge. Does this ring any bells? Yes, the stock mainboard actually has just the right USB A socket, so I will reuse it.
Right now I'm working on a new routing iteration to see whether all connections and placements would still be feasible on 6 layers (starting with the BGA fanouts):

View attachment 33749View attachment 33750

Still open are the speakers. The placement is defined and they will be front-facing. For that I was considering either my current ones or some Switch OLED speakers, but I couldn't find the dimensions anywhere to check the fitment in my assembly. The Switch OLED speakers are ordered, but I'm still having trouble finding the part number for the connectors they use. Looking at the pictures I would say it's a JST BM02B-SURS style connector, but I would need to order some to check. Maybe someone knows?

Another thing to note about the mainboard: There will still be no bios flash directly on the board; I will use a module similar to my old design. The reason is that I still see a lot of potential in bios customizations, and I would also like to install the PS2 SDK to experiment with a custom version of PS2BBL at some point. I will also expose all signals that run directly from the mechacon to the EE on the bios connector, to maybe do some feasibility testing on removing the mechacon and DSP. @Epaminondas already proved this is possible, so it would be quite interesting to try (certainly not for this portable, but maybe rev. 2.0?).

Technically not part of the project, but related:
For the first mainboard revision I had problems with soldering the EE, as it's quite large and when heated unevenly the substrate tends to warp. I suspect this is why the first try didn't work, but on my current mainboard I saw that the EE substrate is warped too (the corners are closer to the PCB than the center). In theory this could be ok, as the silicon is wire-bonded to the substrate, which might allow for some more stress (took one apart to check...). Anyway, to make the assembly more reproducible in the future I have slowly been designing a reflow oven alongside the project over the last year, and I'm pushing hard to make it ready for the next mainboard revision!
It's based on the Controleo 3 and fully custom-designed to fit a PS2 mainboard (+ a little bit extra). I have no experience with reflow ovens, apart from what I hear from our suppliers at work, so this could be fun to set up and work with. In a week or 2 I should get the last components and be able to start it up for the first time to gain some experience with it!

View attachment 33757

To sum it up: slow progress, but I'm getting there! It used to be faster in the past when I could fully focus on the electronics, but now there are a lot of components that need attention!
Dude I swear you are a freaking legend. How come you're not around on the Discord server? We'd love to have you there!
 
Joined
Apr 12, 2017
Messages
221
Likes
147
Location
Canada
I had a very similar issue with substrate warping on the EE+GS chips I bought. The seller said it was from baking the chips before shipping to remove moisture. The company was actually good enough to give me a refund (it was Kynix), but I had to argue with them for a few days lol. Either way, this hardly looks usable, does it? I think not!

The one on the bottom is from a different company for comparison; the top is the one allegedly warped from baking.

20240521_015149.png

I'm slightly worried about them warping during reflow now, though. Perhaps an incredibly slow temperature ramp to make sure heating is as even as physically possible...

Your board design looks incredible! I especially like how the EE and GS are right down the middle! Super symmetrical! How many layers have you used btw?
 
Joined
Dec 25, 2022
Messages
42
Likes
408
Location
Landeck, Austria
Time for a small update!

Housing:
Over the last month I was working a lot on the mechanical design again with the goal of getting a first prototype printed for various tests:
  • assembly steps
  • fitment of front and back housing, as well as internal components
  • rigidity of the whole assembly
  • ergonomics
  • speaker testing
  • to get an impression of how an SLS print would turn out: tolerances, surface, warping, ...
The outer housing was ordered from JLC today as a black SLS part; the cost including shipping was around 45 bucks (I was surprised by how cheap that is!).
Internally the design is almost ready for a first test fitment too. It's designed to be usable without the outer shell, to allow for testing, troubleshooting and easy shell swaps.
Here is a view of the "skeleton":

skeleton.PNG
skeleton2.PNG


jlc_housing.PNG


A big challenge was, and still is, the speaker design. I was trying to figure out how best to make my Switch OLED speakers sound acceptable, and as I have no clue whatsoever about audio engineering it's trial and error at this point :D My strict requirement is to have front-facing speakers, and as there is nowhere near enough space in the front I had to offset them outwards between the front shell and the battery compartment. On my first try the sound was absolutely terrible, until I figured out that those speakers need an airtight chamber in the back to get any decent volume and sound quality. As there is no space for such a chamber, and to leverage the power of 3D printing, I decided to merge the back of the speaker assembly into the battery compartment. During testing I noticed that the volume and shape of this chamber (obviously) have an effect on the sound, so I will have to continue tweaking the speaker assembly once I get the housing from JLC (I noticed that makes a big difference in sound quality too). After I got the speakers I also compared the plug with some JST datasheets and I'm fairly certain now that they indeed use JST BM02B-SURS style connectors. In my design they will plug directly into the left and right sides of the mainboard, no soldering necessary.

The next concern was the structural integrity of the whole assembly. My biggest worry was stress-induced solder joint failures under the BGAs if the housing is not rigid enough, so I needed a way to make sure this thing is built like a brick in the end. Initially I wanted to use a custom CNCed heatsink in combination with a sheet metal heatspreader and a 3D-printed backplate for mechanical support. After studying JLC's automatic quoting tool for CNC parts I realized that a custom heatsink would not be much cheaper than a full aluminium block & heatsink combo over the whole mainboard (if designed carefully to avoid expensive geometry). This would not just vastly increase the mechanical robustness of the mainboard, it might also improve thermals, and it lets me compress three components into one to save some design effort. The current version would have been about 50 bucks to machine, but I'm sure I can still lower that a bit with some simplifications.

The triggers and bumpers were also quite challenging to figure out, as the space is limited and the geometry is complex. The triggers felt terrible in the beginning, because I was too focused on a hinged design. Not even the rubber membrane switches @Lightning recommended helped; they were too big, and even using just the plain membrane didn't quite fit right. For the bumpers, relying on the plastic bending works surprisingly well and feels great (PETG is a very good choice there), so I decided to try the same concept for the triggers. There it's a bit trickier to find the right balance and the travel is limited (which rules out the rubber membranes), but it just feels great with the KSC222JLFS switches, I even prefer it over the Switch triggers! Material fatigue doesn't seem to be an issue with the test results I have so far (a couple thousand cycles, PETG), so this will be the one!

trigger.PNG


The flex PCBs were defined too, even though the outlines are not final yet. There will be 5 in total: left and right gamepad, memory card flex, left and right battery. The gamepad and battery PCBs plug into the left and right edges of the mainboard, with the batteries using a Molex SlimStack series battery connector and the gamepads using Hirose DF40C series connectors. The memory card flex connects to a DF40C right next to the EE and carries most of the front IO signals.

Mainboard:
In parallel I was also working on the mainboard: cleaning up the schematics, defining the outline and doing the first routing iteration. The PS2 side and power management are mostly routed; the FPGA part is still missing most of the SDRAM interface and the RGB video output (which needs quite a bit of pin swapping compared to the last revision).

I was also defining a new way of loading the T20's bitstream. For the final board I would like to be able to update over USB-C, making full use of the RP2040's drag & drop programming. For that I'm planning to test loading the bitstream onto the T20 via the syscon's SPI, which would allow me to include it directly in the firmware. That way it's possible to update both the syscon firmware and the video processor bitstream by dragging the new file onto the mass-storage device, no debugger needed. To make this easier I also added the BOOT0 pin of the RP2040 to the pinout of the right gamepad PCB, so the bootloader can be activated without disassembling the unit. As this is all very experimental, I added jumpers to disable this feature in case it doesn't work.
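For anyone curious what that could look like on the RP2040 side, here's a very rough pico-sdk sketch. The pin numbers, the CRESET/SS handshake and the number of trailing clocks are assumptions that still need to be checked against the Trion configuration guide, and t20_bitstream[] just stands in for the image baked into the firmware:

```c
/* Sketch only: push a Trion bitstream out over SPI from the RP2040 syscon.
 * Pin mapping and the exact passive-SPI handshake are assumptions -- verify
 * against the Trion configuration user guide before trusting any of this. */
#include <stddef.h>
#include <stdbool.h>
#include "pico/stdlib.h"
#include "hardware/spi.h"

#define PIN_CCK     2   /* SPI0 SCK -> T20 CCK      (assumed wiring) */
#define PIN_CDI     3   /* SPI0 TX  -> T20 CDI      (assumed wiring) */
#define PIN_SS      5   /* GPIO     -> T20 SS_N     (assumed wiring) */
#define PIN_CRESET  6   /* GPIO     -> T20 CRESET_N (assumed wiring) */
#define PIN_CDONE   7   /* GPIO     <- T20 CDONE    (assumed wiring) */

extern const uint8_t t20_bitstream[];   /* image embedded in the firmware */
extern const size_t  t20_bitstream_len;

static bool t20_configure(void)
{
    spi_init(spi0, 10 * 1000 * 1000);            /* 10 MHz configuration clock */
    gpio_set_function(PIN_CCK, GPIO_FUNC_SPI);
    gpio_set_function(PIN_CDI, GPIO_FUNC_SPI);
    gpio_init(PIN_SS);     gpio_set_dir(PIN_SS, GPIO_OUT);     gpio_put(PIN_SS, 1);
    gpio_init(PIN_CRESET); gpio_set_dir(PIN_CRESET, GPIO_OUT); gpio_put(PIN_CRESET, 1);
    gpio_init(PIN_CDONE);  gpio_set_dir(PIN_CDONE, GPIO_IN);

    /* Select the FPGA and pulse CRESET so it samples passive-SPI config mode. */
    gpio_put(PIN_SS, 0);
    gpio_put(PIN_CRESET, 0);
    sleep_ms(1);
    gpio_put(PIN_CRESET, 1);
    sleep_ms(1);

    /* Stream the bitstream, then a few dummy bytes worth of clocks for startup. */
    spi_write_blocking(spi0, t20_bitstream, t20_bitstream_len);
    const uint8_t dummy[16] = { 0 };
    spi_write_blocking(spi0, dummy, sizeof dummy);

    gpio_put(PIN_SS, 1);
    return gpio_get(PIN_CDONE);                  /* high = configuration succeeded */
}
```

spi_write_blocking() blocks until the whole image is out, which is fine here since this would only run at power-up or after an update; if it doesn't pan out, the jumpers mentioned above make it easy to fall back to the normal configuration path.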

Reflow oven:
My reflow oven was also finished and tested. The Controleo 3 luckily includes PID autotuning and a scoring system that rates the oven's performance automatically, which made the whole setup quite easy. I guess the 100% oven score speaks for itself :D

IMG_20240615_144831263_HDR.jpg


I already did some test reflows using my old mainboard design, mainly to check for hot and cold spots and to evaluate the warping. The temperature variance over the usable area of the PCB tray reached about 20°C max when running a lead-free profile. Warping of the EE and the mainboard in general is also reduced; I suspect it's due to the more even heating and the slower cooling. Another thing I noticed is that the reflow is much smoother (more accurate temperatures on the tray, less temperature variation) when taping the thermocouple to the PCB instead of letting it hover in the air above it.
This looks quite good now! The only questions left are:
  • how does it perform for double-sided loads?
  • which profile would be needed for mixed leaded and lead-free assembly (thinking about the T20 and SDRAM)? A rough sketch of the typical profile numbers is below.
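For reference, here is roughly what such a profile boils down to as data. These are just the typical SAC305 numbers from paste datasheets (not the Controleo 3's own profile format), and the mixed-alloy question would mainly change the peak temperature and the time above liquidus:

```c
/* Generic lead-free (SAC305-style) profile, for orientation only -- the real
 * numbers must come from the solder paste datasheet. */
typedef struct {
    const char *name;
    int target_c;       /* temperature at the end of the phase, degC */
    int min_seconds;    /* rough duration window for the phase */
    int max_seconds;
} reflow_phase_t;

static const reflow_phase_t leadfree_profile[] = {
    { "preheat",  150,  60, 120 },  /* ramp at roughly 1-2 degC/s              */
    { "soak",     180,  60, 120 },  /* even out the temperature across the PCB */
    { "reflow",   245,  45,  75 },  /* peak; window = time above 217 degC      */
    { "cooldown",  50,  60, 180 },  /* keep below ~4 degC/s to limit warping   */
};
```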
IMG_20240616_215616969.jpg


Boot rom flex:
After a discussion with @Lightning I was motivated to order a flex PCB I designed last year to mod my personal 90k PS2. Sadly I didn't double check what I designed back then before ordering and the design contained an error, which required a bodge wire to work. So second revision it was!

IMG_20240711_193650750.jpg
IMG_20240711_193704565.jpg
IMG_20240711_195056536.jpg


IMG_20240711_195132490.jpg


The idea is to fully replace the stock boot rom with this flex PCB and to flash it either with a 79k bios (to enable FMCB autoboot) or with a 79k bios patched with PS2BBL (to autoboot into OPL, without FMCB entirely). I guess it's quite cool for 90k consoles... Theoretically it could even work on any other PS2 revision that uses the same TSOP-II-50 boot rom (using the respective rom versions, obviously). The possibilities are endless in theory; this is just the beginning and there are probably a lot more uses than I could ever think of! For my personal 90k this was certainly the easiest and cleanest modchip install ever. With the latest version of PS2BBL it might even be possible to launch OPL directly from MX4SIO; currently it needs to be either on a USB drive or a memory card. I will clean up the design files a little bit and maybe I'll attach them to the next update...
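Not strictly necessary, but here is a quick sanity check one could run on a dump before flashing it to the flex. It assumes the usual 4 MiB dump size and that the ROMDIR table starts with a "RESET" entry (my recollection of the format, worth double-checking):

```c
/* Sanity-check a boot rom dump before flashing.  Assumptions (double-check!):
 * dumps are 4 MiB, and the ROMDIR table is a series of 16-byte entries, the
 * first one named "RESET". */
#include <stdio.h>
#include <string.h>

#define ROM_SIZE (4u * 1024u * 1024u)

static unsigned char rom[ROM_SIZE];

int main(int argc, char **argv)
{
    if (argc != 2) { fprintf(stderr, "usage: %s <romdump.bin>\n", argv[0]); return 1; }

    FILE *f = fopen(argv[1], "rb");
    if (!f) { perror("fopen"); return 1; }
    size_t n = fread(rom, 1, sizeof rom, f);
    fclose(f);

    if (n != ROM_SIZE) {
        fprintf(stderr, "unexpected size: %zu bytes (expected %u)\n", n, ROM_SIZE);
        return 1;
    }

    /* Look for a 16-byte-aligned ROMDIR entry named "RESET". */
    for (size_t off = 0; off + 16 <= ROM_SIZE; off += 16) {
        if (memcmp(rom + off, "RESET", 6) == 0) {
            printf("ROMDIR found at offset 0x%zx -- looks like a boot rom image\n", off);
            return 0;
        }
    }
    fprintf(stderr, "no RESET entry found -- probably not a boot rom dump\n");
    return 1;
}
```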
Big thanks at this point to @Epaminondas and @mixa232323, this mod would not be where it is without them!

Now to the disadvantages: DVDs will not play, as this mod removes the DVD player portion of the bios chip entirely. I tested this on my 90k and can confirm that PS2, PS1 and CD discs play just fine, but as soon as you try to launch a DVD in the browser, the error message below pops up (sadly no smoke, no fire, it just throws you back into the browser).

IMG_20240630_102145814.jpg


EDIT: The project files for the boot rom flex are attached. You have my written confirmation to do with them whatever you want! Have fun!
 
