Sunday, 17 April 2016

A Compact Flash Interface for the Microbee

One of the issues with my Microbee, and probably an issue that plagues many owners of very old microcomputers, is that of getting programs and files on and off. Old computers use floppy disks or cassettes, which simply aren't supported on modern systems.

I've already covered some of the ins and outs of getting cassette data on and off a bee using an ordinary soundcard on a modern computer, resulting in being able to play games on a "cassette" bee. The next logical step is disks.

Microbees are able to run CP/M, a predecessor to DOS, which allows you to read and write floppy disks (and even hard disks), and run quite a lot of off-the-shelf software, like Wordstar.

Some of my bees have floppy disk controllers, and are thus able to run CP/M. I have a couple of 3.5" double density disk drives, plus a box of blank disks and a single solitary CP/M "system" disk. My system disk is a bit iffy. I've been unable to make other system disks from it as I believe the "setsys" program is broken.

As with tapes, there is a cornucopia of MicroBee CP/M software on the internet, most notably at the Microbee Software Preservation Project. The issue is getting the software off the internet and into the Microbee. Disk drives are expensive, physically large, and fragile. The disks they take are increasingly difficult to find. There are, however, other ways to store disk data.

The obvious one is Compact Flash. Compact Flash cards are still widely available, due to their popularity with high-end cameras. They have an "IDE compatibility" mode, whereby they pretend to be a hard disk. Late in the life of the Microbee, a hard disk model was produced, so there's a "BIOS" available for hard disks. There's been quite a bit of activity on the Bee Board (a predecessor to the MSPP) and the MSPP in getting Compact Flash cards to work, with reasonably mature software, thanks to the efforts of Kalvis. I also built some hardware about ten years ago, which never really made it to prime-time, as it was very flaky.

So early this year I figured I'd resurrect these efforts, with the goal of making an accessible "coreboard" that could replace the memory board on most-any Microbee and allow the machine to boot CP/M from Compact Flash. The Compact Flash card could then be read and written to on a PC, facilitating easy file transfer, both to and from the bee, and making a machine that's straightforward to play with and nicely self-contained.

There are a bunch of different Compact Flash interfaces already built for 8 bit systems. I've based mine on "GIDE", by Tilmann Reh. GIDE is supposed to go in the socket for your Z-80 CPU, allowing pretty-much any system with a Z-80 to have an IDE interface.

A Google search on Compact Flash interfaces for 8 bit systems will show the degree of frustration that people have in getting them to work. Some cards work beautifully. Others just refuse to read or write. I dove straight into these frustrations and I think I've worked a lot of the issues out.

Anyway, firstly, the hardware. My IDE interface uses a pair of 74HC646 registered transceivers, as per GIDE. These chips allow us to latch data going to and from the CF card, such that we can talk to a 16 bit card with an 8 bit CPU. Some (many?) cards are able to be put into an "8 bit" mode, but from reading accounts on the net, this isn't guaranteed to work from one card to another. In any case, the 16 bit interface is supported by all cards, as it's part of the IDE standard.

GIDE uses a pair of PALs to create all the enables and clocks for the registered transceivers, as well as do IO port address decoding. I rolled both these PALs into one Atmel ATF1502ASL CPLD. These chips have 44 pins, are available in a reasonably friendly PLCC package (through-hole via a socket), and run at 5V, so they play nice with the rest of the Microbee without mucking about with level shifters etc.

Here's a schematic for my IDE/CF interface:

There's just three chips involved: a pair of 74HCT646 registered transceivers, and the CPLD.

While I was laying out a board, I also added memory (up to 512K of RAM and 128K of EPROM), and a floppy disk controller:

The whole lot is laid out in a simple 2 layer board, with 12 thou tracks and clearances:

Next I had a bunch made. I used iTead, a Chinese low-volume board house. I was very impressed with the quality and price of the boards, but they took ages to arrive. Much wall climbing ensued. I then set about assembling a couple. The only challenging bit is the CF socket, which has pins on 0.635mm centres.

Getting it going started with first building a simple "SRAM" memory management PLD, which makes it pretend it's a normal static RAM coreboard, with 32K of RAM and 24K of ROM. Once I had this working I got the floppy disk controller running, then went to work on the CF interface:

This was a lot harder than I anticipated. I got it working after a fashion, but it was very touchy. Probing things killed it. Touching ICs killed it. It was just really difficult. I started by porting GIDE to CUPL, and implementing it in the ATF1502ASL. Much of the touchiness with my CF card (a 64MB SanDisk one) was related to iord and iowr. These signals gate data to and from the card. On a PC, they connect pretty-much directly to pins on the 8088 CPU. GIDE uses the logical AND of rd and iorq to generate iord (all negative logic), and the logical AND of wr and iorq to generate iowr. The chip selects are generated by the logical AND of address lines and iorq.

What this means is that cs and iord/iowr happen synchronously with one another, and as I was about to learn this isn't necessarily good. After much frustration I found the Sandisk CF manual, which shows timing diagrams for "PIO mode IDE":

If you look really carefully, you'll see that cs must be asserted _before_ iord or iowr, and that it has to be held active _after_ iord or iowr are deasserted. Our simplistic method of simply gating everything with iorq just won't cut it.

Things got a whole lot more reliable once I removed iorq from the chip select logic. That ensured that chip select was asserted well before iord or iowr, and held active well after. It was still a little problematic though in that the chip select was activated for both IO and memory accesses, where it should really only be active for IO accesses. Also, further reading of the manual says that chip select should only be asserted _after_ the address lines are valid, not at the same time.

The solution to this lies in more logic. The Z80 (Microbee processor) holds iorq, rd, and wr valid for just over two clock cycles, asserting them just after the start of the T2 clock cycle, and deasserting them after the mid-point of the T3 clock cycle. Note there's always an extra "wait" clock cycle in the middle:

So if we create a cfiorq signal, asserted on the negative-going clock edge after iorq and deasserted exactly two clocks later, and use this to generate iord & iowr, we get the timing just so. This is done with a three-state state machine:

/* cfiorq state machine - generates a 2 clock pulse starting on the first
   low-going clock edge after iorq. State encoding (state0, state1):
   00 = idle, 10 = first clock of the pulse, 11 = second clock */

state0.d = iorq & !state0 & !state1 # state0 & !state1 ; /* set on iorq, hold for one more clock */
state0.ck = !clk ;                                       /* clock on the falling edge */
state0.ar = rst ;
state0.sp = 'b'0 ;
state1.d = state0 & !state1 ;                            /* trails state0 by one clock, then both clear */
state1.ck = !clk ;
state1.ar = rst ;
state1.sp = 'b'0 ;

cfiorq = state0 # state1 ;

Then iord & iowr are:

/* iord */

iord = tfradr & cfiorq & rd
# datadr & !lh & cfiorq & rd ; /* only assert iord for task file, cs1, and first data read */

/* iowr */

iowr = tfradr & cfiorq & wr
# datadr & lh & cfiorq & wr ; /* only assert iowr for task file, cs1, and second data write */

I've further restricted them so that they're only activated for specific accesses to the CF, which probably isn't strictly necessary, but hey, it works.
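
For the curious, here's a quick Python simulation of that state machine (my own throwaway sketch, nothing to do with the real PLD toolchain), transliterating the CUPL equations to check that the strobe is exactly two clocks wide and starts on the falling edge after iorq:

# Simulate the cfiorq state machine, one iteration per falling clock edge.
# Positive logic for readability; the real signals are active low.
def simulate(iorq_samples):
    state0 = state1 = 0
    cfiorq = []
    for iorq in iorq_samples:
        # next-state equations, straight from the CUPL above
        d0 = (iorq and not state0 and not state1) or (state0 and not state1)
        d1 = state0 and not state1
        state0, state1 = d0, d1
        cfiorq.append(int(state0 or state1))   # cfiorq = state0 # state1
    return cfiorq

# iorq held for about two and a half clocks, as the Z80 does for I/O:
print(simulate([0, 1, 1, 1, 0, 0]))   # -> [0, 1, 1, 0, 0, 0]

Gating iord and iowr with cfiorq rather than raw iorq is what buys the setup and hold margin around the chip selects.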

So here's what my timing looks like in the flesh, on my lovely HP1652B logic analyser, which is nearly as old as the bee. Firstly a single 8-bit read, of the status register on the CF:

Accesses to the 16 bit data register make use of the GIDE lh signal, which toggles between bytes. Note that there's only one chip select and iord/iowr for every two bytes, at the first byte for the read, and second byte for the write. On alternate bytes the 74HC646 registered transceivers are clocked:
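
Here's the gist of that byte-pairing sketched in Python. It's purely illustrative, with made-up names; the real thing is of course done by the CPLD and the '646s:

# Model of the '646 latching scheme: one 16 bit CF access per two 8 bit
# CPU accesses. cf_read_word / cf_write_word stand in for the card itself.
class DataPath:
    def __init__(self, cf_read_word, cf_write_word):
        self.cf_read_word = cf_read_word
        self.cf_write_word = cf_write_word
        self.lh = 0        # low/high byte toggle, as per GIDE
        self.latch = 0     # the byte parked in the '646

    def cpu_read(self):
        if self.lh == 0:                  # first byte: iord strobes the card
            word = self.cf_read_word()
            self.latch = word >> 8        # high byte caught in the latch
            value = word & 0xFF
        else:                             # second byte: latch only, no iord
            value = self.latch
        self.lh ^= 1
        return value

    def cpu_write(self, value):
        if self.lh == 0:                  # first byte: just park it
            self.latch = value
        else:                             # second byte: one iowr, 16 bits
            self.cf_write_word(self.latch | (value << 8))
        self.lh ^= 1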

And finally here's my little bee, covered in probes to capture these waveforms:

The result of this is an interface that works with most every Compact Flash card I can throw at it.

In any case, more detail at the MSPP, including source code for the PLDs, firmware for the ROM, and Kalvis' wonderful CF CP/M.

Friday, 4 March 2016

Frogger

Here's a very typical example of getting an old cassette based game to run on the MicroBee. In this case the original came to me already as a sampled .wav audio file, but in the past I've done the sampling myself from (often very poor quality) cassette tapes.

So I start with froger-j.wav, downloaded from the Microbee Software Preservation Project website.

It's a 22.05 kHz sampled mono wav file. This is problematic, as the MicroBee cassette standard wants 1200 and 2400 Hz tones to represent binary 0s and 1s respectively. Neither 1200 nor 2400 divides cleanly into 22050, so the resulting waveform has a lot of timing jitter, which a real MicroBee hates.
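
The arithmetic makes the problem obvious. A couple of lines of Python, just spelling out the numbers:

# samples per tone cycle at each sample rate
for rate in (22050, 9600):
    for tone in (1200, 2400):
        print(f"{tone} Hz at {rate} Hz: {rate / tone} samples per cycle")

# 22050 Hz gives 18.375 and 9.1875 samples per cycle - fractional, so the
# zero crossings jitter from cycle to cycle. 9600 Hz gives exactly 8 and 4.

That's why resampling to a rate that's a clean multiple of both tones makes such a difference.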

Trying it on a physical bee using my Macbook as a tape drive confirms it's not a goer. The bee doesn't even get as far as detecting a valid header.

Next step is to have a go at decoding the file on the mac. Many years ago I wrote a couple of simple utilities, wav2dat and dat2wav, which convert .wav format files to data and vice-versa. They were subsequently picked up by other more talented programmers (Kalvis), who made real utilities out of them rather than the buggy I-is-coding! stuff that I write. They're rather more forgiving than a real bee, so generally have no dramas reading less-than-perfect waveforms. Sure enough, wav2dat converted the file easily, and dat2wav converted it back to a clean 9600 Hz sampled .wav file. The audacity plot below illustrates this. Because the sample periods don't line up cleanly with the data, there's a lot of zero-crossing jitter on the top (22.05 kHz) trace. The other two traces are 9.6 kHz versions, and they're lovely and clean.
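
The core idea is simple enough. Here's a toy version of the zero-crossing approach in Python (a sketch only, not the actual wav2dat, and it assumes a 16 bit mono file):

# Classify each tone cycle by the spacing between rising zero crossings:
# long cycles are ~1/1200 s, short cycles ~1/2400 s.
import wave, struct

def tone_periods(path):
    with wave.open(path) as w:
        rate = w.getframerate()
        n = w.getnframes()
        samples = struct.unpack(f"<{n}h", w.readframes(n))
    last = None
    for i in range(1, n):
        if samples[i - 1] < 0 <= samples[i]:      # rising zero crossing
            if last is not None:
                yield (i - last) / rate           # period in seconds
            last = i

# midpoint between 1/1200 and 1/2400 is 1/1600 s
tones = ["L" if period > 1 / 1600 else "S"
         for period in tone_periods("froger-j.wav")]
# runs of identical tones then get grouped into data bits (a bit lasts
# several cycles at 300 baud), framed into bytes, and written out

A real decoder does rather more filtering and framing than this, which is why Kalvis' utilities cope with scungy tapes that this wouldn't.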

So trying that on the real bee gets an actual file that loads. Alas after loading we just drop back to a prompt rather than running anything. This sort of behaviour is typical for copy protected games. It was popular to mess around with bits in the header, so if someone made a copy using a monitor program, for example, it wouldn't work.

But we've got much more than a monitor. We've got the tools to make any waveform we like. So comparing the header in the cleaned up file (bottom trace) to the header in the original dirty file (top trace), the differences are obvious:

The middle trace is what happens next. I brute-force edited the waveform in audacity to change the bits, so my cleaned up version has the same header as the original game.

So now, when I load the game, I get it to run, but still something is seriously wrong:

The key to what's happening here is in the size of the garbled character cells at the bottom. They're mostly square, rather than the very rectangular character cells the MicroBee normally uses. It looks like the programmer has run the screen in a 64 x 32 character mode with 8 x 8 characters, rather than the more normal 64 x 16 character display with 8 x 16 characters. This is okay, as there are 2K bytes of screen RAM, enough for the 2048 characters that result. It's wasteful of PCG RAM though, as half of the PCG RAM can't be displayed.

Colour (and Premium) MicroBees expect the programmer to initialise the screen colours though. This game was clearly written before there were colour bees. The top half is initialised by BASIC, as that's part of the normal 64 x 16 screen. The bottom half's colours aren't initialised, so the colours are garbage. Not to worry. We can do this in BASIC before we load the game.

So the following code clears all of colour RAM to green on black, making the uninitialised bottom half match the top:

10 out 8,64 : REM Enable colour RAM in top 2K of memory
20 for i=63488 to 65535 : poke i,2 : next i : REM clear all colour RAM to green / black
30 out 8,0 : REM back to normal PCG

So if we run this, then load the game again, we're in business:

Tuesday, 23 February 2016

Microbee Cherry keyswitch adapter boards

Here's what happens when you're out at work for the week and a box containing 500 little tiny circuit boards arrives, so you ask your husband to take a photo. No prizes for guessing what Perry's into...

What it is is another step in making my Microbee keyboards totally wonderful and reliable. This little PCB goes between the keyswitch and the Microbee baseboard, correcting the PCB layout.

Sunday, 14 February 2016

Designing a new Microbee

One of the neat things about the Microbee, and I guess for me the reason it has enduring appeal, is that its design is wide-open and freely available, and was right from the start in 1982. The bee was originally conceived as a kit computer, and details of the kit, including a comprehensive "how it works" section, were published in magazines at the computer's launch. Applied Technology, the makers of the Microbee, initially made their money selling electronic components for hobbyists, so it was in their financial interest to ensure that their computer was as open and well understood as possible. People would then play with it and buy parts off them to do so.

This was the heyday of electronics hacking. I remember as a teen going to "computer fairs", where hobbyists displayed their toys alongside the rapidly burgeoning industry reps, who were probably hobbyists just a few years previously, playing with S100 systems and so forth.

So Microbee was never Apple, Commodore or Atari. There was no money for custom silicon, and that's good. The problem with custom chips is that they're built for a specific task, have a lifetime of perhaps three years before they're obsoleted by the next custom silicon, and then they're out on the scrap heap. No decent documentation ever gets published for them, as those developing the silicon are frightened that their rivals will steal all their IP. An example of this is VGA. VGA is so much more than a 15 pin connector on the back of old PCs. It's a whole display hardware system that IBM developed in the late eighties with the launch of the PS/2 range, and it extended the graphics potential of PCs considerably. It made Windows possible.

But try finding documentation for VGA cards. Schematics. Google gives connector pinouts, scan frequencies, mode tables. But absolutely nothing on the inner workings of a real VGA card. Modern emulations are just a brute-force reverse engineering of the card: this data in these registers gives these results.

The bee is different. Its graphics are open, based on the Rockwell 6545 CRT controller (a very close relative of the 6845 used in early IBM graphics cards). Because Applied Tech couldn't afford silicon, the whole design is there for us to see and play with, right there in front of us. So let's play with it.

The objective of the exercise is to extend the Microbee's video hardware so that it's capable of playing Pacman. Not a stripped-down game that looks a little like Pacman (ghostmunchers), but to actually write the game on the bee and have it look and feel identical. The bee was never able to do this due to basic hardware limitations.

So let's have a look at how the bee's video hardware evolved over its lifetime. The bee started life as a kit in '82, based roughly on a couple of S100 cards that Applied Technology were selling at the time. The DG640 was the basis of the bee's video hardware. Pre microcomputer, people used serial terminals to talk to computers. The serial terminals had a rudimentary screen and keyboard, and the computer did all the processing. Everything was very-much character based. The terminal was essentially a glass typewriter.

People didn't tend to own serial terminals in '82, so Applied Technology included one in the design of the kit, using a modified television as the screen. This is where the 6545 comes in. The 6545 was designed as a CRT controller for terminals.

So the design of the video hardware in the original bee closely follows the standard terminal application note for the 6545. The CRT controller chip is connected to "screen RAM", which contains the ASCII value of each character position on the screen. The eight output bits of the screen RAM select a character from a character generator ROM, with its least significant 4 bits driven directly by the 6545. The resultant 8 bits are serialised by an 8 bit shift register, and the output data quite directly drives the intensity input of a CRT. The 6545 has a bunch of counters in it to generate the screen RAM addresses and the row addresses for the character ROM.

Early memories were really slow. Typical static memories would do their thing in around 250ns. The access time of 250ns for the screen RAM, then a further 250ns for the character ROM or PCG RAM, dictates the overall resolution. Each screen + PCG access yields 8 bits of data, stuck into a shift register, so we can generate a pixel every 500/8 = 62.5 ns, plus a bit for other logic. This equates to a maximum dot clock of 16 MHz. Microbee went with 12 MHz initially, and then upped the speed to a whopping 13.5 MHz for later models. At a screen redraw rate of 50 Hz, this equates to an absolute maximum of 320,000 pixels. Allowing for retrace we get quite a bit less than this. Microbee initially went with a 512 x 256 screen (131,072 pixels), and later with a 640 x 275 screen (176,000 pixels).

Applied Technology realised that ASCII only contains 128 characters, and the 8 bit output of the screen RAM could address 256 "characters". So they included a 2K x 8 RAM alongside the 2K x 8 character ROM, which could be loaded with values by the CPU. This "Programmable Character Graphics" (PCG) RAM allowed high resolution but incredibly limited graphics. The Microbee as originally sold had a graphics resolution of 512 x 256 (128K pixels), but there was only enough PCG RAM for 1/8th of this.
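
Running the numbers, as a few lines of Python spelling out the arithmetic above:

# two chained 250 ns accesses per 8 pixel character cell
char_ns = 250 + 250              # screen RAM, then character ROM / PCG
pixel_ns = char_ns / 8           # 62.5 ns per pixel
max_dot_mhz = 1000 / pixel_ns    # 16 MHz ceiling

for dot_hz in (12e6, 13.5e6, 16e6):
    print(f"{dot_hz / 1e6:4.1f} MHz -> {dot_hz / 50:,.0f} pixels max at 50 Hz")
# 512 x 256 = 131,072 and 640 x 275 = 176,000 both fit under the 12 and
# 13.5 MHz ceilings respectively, leaving room for retrace.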

Soon after it went on sale, a colour mod was developed. The colour mod patched a second 2K x 8 RAM alongside the screen RAM. This gave eight further bits per character cell, which could define a foreground and background colour. Early colour bees used a rather strange 32 foreground (5 bits) and 8 background (3 bits) scheme, but with the release of the premium model they went with a more usual 4 bits (RGBI) foreground, 4 bits background scheme.

But there were still only 128 PCG characters, so there were no pixel addressable graphics. Late in the Microbee's life Applied Tech redesigned the bee mainboard and fixed this. They added a third 2K x 8 RAM in parallel with the screen RAM, for a total of 2K x 24 bits of screen RAM. Now 11 bits are used to select one of 2048 PCG characters, plus 8 bits for colour, with the remainder essentially wasted (used for flashing characters, inverse video, etc.).
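
To make that concrete, here's one way to picture the 24 bit screen word in Python. The bit packing is illustrative only; on the real board the three bytes live in three separate RAM chips:

# a notional 24 bit screen word: 11 bit PCG index, 8 bit colour, 5 spare
def unpack_screen_word(word):
    pcg_index = word & 0x7FF           # 2^11 = 2048 PCG characters
    colour    = (word >> 11) & 0xFF    # 4 bit foreground + 4 bit background
    attrs     = word >> 19             # flashing, inverse, etc.
    return pcg_index, colour, attrs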

The mainboard was getting pretty crowded with all this logic. There are three RAM chips for the screen, plus up to four chips for PCG, plus a flock of multiplexers and buffers to allow either the 6545 or the CPU to access screen and PCG memory.

The whole time the memory is still running at the same speed. Every character cell (1/8th of the dot clock) we access screen RAM, then we access PCG, then the data gets serialised. And the graphics are still essentially monochrome. Each pixel gets to choose either the character foreground or background colour, so if we want to render red, white, and blue in consecutive pixels we're out of luck.

But even in the late eighties, when Microbee went belly up due to the onslaught of cheap PC clones, memory was faster than this. There's a design in this, and it doesn't have to be monstrously complicated with thousands of memory chips.

Rules are necessary for this design. First, no SMD. Everything's gotta be through-hole, or at least through via a socket (ie PLCC). PALs and GALs are fine.

So we start by ditching two of the three screen RAMs. In order to get our 24 bits of screen data, we access one RAM three times each character clock, latching the data each access in a simple 'AC574 octal flip-flop. Three isn't a binary division though, so let's do four. The last access can be used for the CPU, so we don't make the CPU wait until a retrace period to access screen RAM. At a 13.5 MHz dot clock, this isn't stressing the memory at all. Each access is 148ns, easily doable with contemporary 120 or 100ns RAM.
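
The slot arithmetic, for the record (assuming the 13.5 MHz dot clock and 8 pixel characters):

# four memory slots per character cell
dot_clock = 13.5e6
char_ns = 8 / dot_clock * 1e9     # ~592.6 ns per character cell
slot_ns = char_ns / 4             # ~148 ns per access
print(f"character cell {char_ns:.1f} ns, slot {slot_ns:.1f} ns")
# 148 ns slots leave decent margin over 100-120 ns RAM for the '574
# latches and address buffers.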

So each character clock, we do screen, then attribute, then colour, then CPU. We've got enough data after attribute to do a PCG read, so the PCG access starts concurrently with the colour read.

We can do the same thing with PCG to allow us to read four planes of PCG RAM in a given character clock (one each for R, G, B, and I). That doesn't allow us to do any CPU accesses though, so if we push the PCG RAM out to 16 bits rather than 8, we get all our data in just two reads, leaving one for CPU, and one for... Hey, let's add a blitter!

A blitter is logic or a processor that just moves things around in memory. It's really useful for graphics. One can give it data for a sprite or window, and ask it to draw it in various screen locations. In its simplest form it's just another CPU that has access to video memory.

Now having three things trying to access one address bus is a pain. The usual Bee uses multiplexers (74LS157s) to select between 6545 or CPU addresses. Multiplexers don't scale cleanly, so let's use tri-state buffers instead. In the 6545's case it makes sense to use tri-state flip-flops for its address lines, to ensure they're valid for the whole cclk cycle.

We still end up with four RAM chips though, simply because 64K x 8 RAMs aren't a thing.

Here's the whole lot diagrammatically:

I know it looks complex, but thanks to the faster RAM and tri-state buffers rather than muxes, it's actually on a par with the Premium Bee's chipcount. The whole video memory array, including address and data steering, is 27 chips. The same circuit in the Premium bee is 26 chips.

So continuing the design exercise, let's build a new bee mainboard around this video memory array. We'll use a Z8S180 processor, because these are compatible with the Z80 but go at like 33 MHz rather than 4 MHz, plus add video game sound.

The whole lot is several sheets of schematics. First our video memory:

The CRT controller and Keyboard:

The CPU:

And finally the PIO, RTC and sound chip (a TI SN76489, which makes lovely eighties video game noises):

And to prove that the whole lot is doable, here's the layout, having hit the autoroute button. The PCB is 12 x 8.55 inches, just a smidgen bigger than the early Bee (12 x 8.4), but smaller than later Premium bees (13.4 x 8.55). I've done it in four layers (the mid layers are power planes), with 8 thou tracks and spaces, so it's a doddle to manufacture.

There's a total of nine GALs, where the premium bee used just one. The blitter isn't actually implemented - the idea is to do this on a second board, either integrated with a super-coreboard or else underneath. The coreboard sockets are a superset of the standard bee ones, at 32 pins rather than 25. This allows for the extra address lines of the Z180, as well as some more ground pins. Standard bee coreboards plug in just fine, using 25 of the 32 pins.

Wednesday, 30 December 2015

Making the most of your eighties computer.

Before the rise of the PC in the mid eighties, there was much more diversity in home computing. Australia even had its own home-grown computer, called a Microbee, designed and built in Sydney. I was a teen in the eighties, and a 32K Microbee "Personal Communicator" was my first real computer. It made quite an impression on me, and to this day I'm really fond of them.

Technological advances are brutal, so old computers get ruthlessly discarded. This is bad, because it means that we are in very real danger of losing an important part of our heritage. Think about it. Who has a floppy drive on their computer? How about a 5.25" floppy drive? How about 8"...? So if you had an important piece of data on a 5.25" floppy, what would you do?

People care, and there's a movement to preserve old digital things. One that I've made some contributions to is the Microbee Software Preservation Project, a group of people who collect, digitise in modern formats, and distribute everything they can get their hands on relating to the uniquely Australian Microbee.

The primary way to keep this stuff alive is to play with it now and then. People have written emulators, such as NanoWasp, which even runs in a browser. There's even a VHDL description of a bee, which is way cool. But I like physical hardware (being an electronics nerd and all). So I've got a few old Bees, one which uses tape and a couple which use old floppy disks. The tape ones are the most easily accessible. Plug into power, plug into a monitor, load a tape, play games. Just like it's 1983 again.

So if you're like me, you'll have stashes of the computers and software, but not a lot of the other paraphernalia that goes with them, like disk drives, monitors, printers, etc. Monitors and disk drives are heavy and fragile, and take up a lot of storage space, so they're usually the first bits to go to landfill.

Not all is lost. You can still play with them, at least with Microbees. A television makes a great monitor, and your PC (mac in my case) makes an awesome tape drive. Go check out the MSPP site for .wav encodings of Microbee software, that will play (making an awful noise if you forget to plug the bee into the speaker output) on your PC. I find full volume works best.

Microbees used a video standard called composite. Back in the early eighties you could buy composite monochrome monitors, which hooked up to the computers of the time and displayed lovely green (or sometimes amber) text. These were rapidly replaced by CGA, then EGA, then VGA, then HDMI, so finding a proper composite monitor is hard.

Not all is lost though. Your DVD player and set top box also have "composite" connections, which are close to those on old monochrome computers.

So, let's plug the Microbee into a modern television using the "AV" connection, which is intended for your old DVD player. The plug is the same as the "AV" socket on the back of the telly, and we straight away get an image! It works, kinda. Not quite as nice an image as a real old monitor gives, but pretty close. Here's what a small portion of the screen of my basic Samsung "720 HD" LCD television looks like playing a classic Bee game:

Now the difference between a television and a composite monochrome monitor is that the television is designed to extract intensity (called luminance), colour (called chrominance), and sync (telling the screen when to start a new raster scan and when to start a new line) information from the same physical bit of wire. The monitor doesn't care about colour (chrominance), so its signal just carries luminance and sync.

So there's no colour information. That's why it's white. I played with my television a bit and worked out that I could convince it that my monochrome data was in fact green. It's in the "advanced" display settings:

And that's made some improvement. But it's still pretty blurry.

For those of us in PAL countries, the colour information is encoded using the PAL standard. This puts the chrominance data on a sub-carrier at 4.43 MHz, with the audio carrier above that (5.5 MHz for the PAL B/G we use in Australia). Importantly, the standard is designed to provide a video bandwidth of just 5 MHz. That's plenty for DVD video, but not quite enough for the Microbee display.

Working out the Microbee's display needs is straightforward. It uses a "pixel clock" of 13.5 MHz to shift out pixel information. The highest video frequency is achieved when pixels are alternately on and off. They're clocked out on the rising edge of the clock, using a 74LS166 shift register. So the video data coming out of the bee is at half the frequency of the pixel clock, or 6.75 MHz.

Now, 6.75 MHz is a tad higher than 5 MHz. That's why it looks, well, soft. Incidentally this is why early computers that were made to display on televisions typically only displayed 40 columns of text. You just couldn't fit more in the PAL bandwidth limit. And for the yanks, NTSC is even worse.
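
To put numbers on it, treating the bandwidth limit as a hard cutoff:

# alternating pixels at dot clock f produce video at f/2, so a 5 MHz
# channel can carry a dot clock of at most ~10 MHz
pal_bw = 5e6
bee_dot, bee_cols = 13.5e6, 64
print(f"bee's worst case video: {bee_dot / 2 / 1e6:.2f} MHz")    # 6.75 MHz
max_dot = 2 * pal_bw                                             # 10 MHz
print(f"columns that survive PAL: {bee_cols * max_dot / bee_dot:.0f}")  # ~47

So with the same line timing, PAL runs out of puff somewhere under 48 columns, hence all those 40 column machines.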

My telly has a "component video" input, that's made for newer DVD players. That uses three wires to convey the red, green and blue information, plus encoding the sync signals on the green wire. Because it's no longer PAL, there's no 5 MHz filter. What if I were to fool the television into thinking I have a component input rather than a PAL one?

Turns out this is easier than I ever imagined. I just plugged a cable into the red and blue inputs on the telly, and it decided it had a component input. The green one still goes to the standard "composite" signal from the bee.

Interestingly, we're back to displaying white, but much sharper white than previously. The television notices there's no data on the red and blue inputs, so defaults to a "luminance chrominance" mode where the luminance is on the green wire and chrominance is on the other two. Not to worry, we can force it to use RGB, using the same method as before. And the results are spectacular. A display that's every bit as good as I could get on a high-end composite monitor back in the day:

Zooming back out shows my whole bee, ready to kill centipedes:

Wednesday, 23 December 2015

Varnish

Varnishing isn't easy. It takes a while to get good results. The good thing about it is you can always just apply another coat. I think I'm getting better at this, after (thinks!) six coats on my rowing thwart, I've got a reasonably good recipe that looks set to provide a tolerable finish, at least after a few more coats...

My ingredients are Feast Watson spar varnish, Penetrol and real gum turpentine to help the stuff flow, and a proper varnish brush, which is wide and very thin, so it doesn't hold too much varnish, with super smooth bristles, so it doesn't leave great big ugly brush marks. I'm thinning the varnish out with ~15 percent Penetrol and a further 5-odd percent turpentine. That gives me a mix that flows out nicely. Of course that's a recipe that's highly dependent on environment, brush, technique...

I started sanding with 180 grit, but found 400 works better in the latest coats. Here's the rowing thwart thus far. There's a bit of general lumpiness but the gloss level I'm getting is fairly good:

My rowlock bases and tabernacle have had a couple fewer coats. In the case of the rowlock base that doesn't seem to be an issue, but the tabernacle still needs some love.

I find when I'm putting it on it's good to keep handy a little container of mixed varnish + Penetrol + turps, plus one with half an inch of pure turps in it, so I can thin out and clean the brush periodically, to keep it from sticking to everything.

Sunday, 20 December 2015

Camouflaged cat is camouflaged.

See if you can spot the cat hiding in this picture:

Of course like all good cats, Mogget's goal in life is to ensure his paw prints are in all varnish.

Saturday, 12 December 2015

Finished the coamings

After gluing the coamings in place, the next step involved trimming them so they were the correct size. This involved a process not-unlike trimming my fringe. Take a little off one side, look at it from afar, take a little off the other side, look at it from a distance, take some more off... Luckily I managed to stop myself before I reached the deck.

Then I sanded things smooth, coated with epoxy + filler (this 4mm ply has a pretty crap open-grained face ply, which swallows epoxy), then a couple of coats of unthickened epoxy, then a sand down to 180 grit, and finally Toplac paint.

Here's what it looks like tonight.

The shaped bit in the bow is to allow clearance so I can flip the forward thwart hatches over. The coaming is about 65mm above the deck at the bow, and 22mm above deck where I'm likely to sit on it.

Next job is to complete sanding the decks out to 180 grit and then paint them with top coat. No, I'm not using undercoat. I really dislike the stuff - it clogs emery way too fast for my liking.