Friday, February 20, 2026

No need for non-data aware components!

About 15 minutes after writing yesterday's blog1, which concluded that I could migrate the Manager ERP program by means of non-data aware components (as Delphi 12 CE is missing the data controls tab in the component palette), I found a very interesting blog that shows code for converting a standard component (a panel) into a data aware component.

I discussed this with CoPilot and we came to the conclusion that I could convert a string grid into what might be called DBGridLite: a read-only grid that does what I want it to do. This mythical component would be data aware, of course. Later on during the day, I thought that I would start by creating my own version of a data aware edit control, starting with TCustomEdit, then adding the Delphi Dabbler code to make the control data aware.

This was very easy: basically I pasted the dabbler's code into my unit and updated the method names. When I came to compile the unit, the compiler told me that I was missing two units, DB and DBCtrls. I added these, the component compiled, and I added it to the standard component library. I then wanted to write a simple test harness, so I opened a new project, copied into it the FDConnection from the Manager program (the one that I am converting), then added a TFDQuery and a TDataSource. I started looking in the component palette for my new component, and to my surprise saw the Data Controls tab! In this tab were all the regular components, such as TDBEdit and TDBGrid. So I dropped a TDBEdit onto my form, connected it, then ran the program. The TDBEdit shows what it should do. In other words, the data controls do work with D12CE! A DBGrid also works. Here's what CoPilot had to say about this.
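For anyone curious about what "making a control data aware" involves: the standard VCL mechanism is an embedded TFieldDataLink, which is presumably what the dabbler's code uses. Here is a minimal sketch of that pattern; the unit and class names are hypothetical, not my actual component.

```pascal
unit MyDBEdit;

interface

uses
  System.Classes, Vcl.StdCtrls, Data.DB, Vcl.DBCtrls;

type
  // A TCustomEdit descendant that delegates all dataset plumbing
  // to an internal TFieldDataLink.
  TMyDBEdit = class(TCustomEdit)
  private
    FDataLink: TFieldDataLink;
    procedure DataChange(Sender: TObject);
    procedure UpdateData(Sender: TObject);
    function GetDataSource: TDataSource;
    procedure SetDataSource(Value: TDataSource);
    function GetDataField: string;
    procedure SetDataField(const Value: string);
  protected
    procedure Change; override;
  public
    constructor Create(AOwner: TComponent); override;
    destructor Destroy; override;
  published
    property DataSource: TDataSource read GetDataSource write SetDataSource;
    property DataField: string read GetDataField write SetDataField;
  end;

implementation

constructor TMyDBEdit.Create(AOwner: TComponent);
begin
  inherited Create(AOwner);
  FDataLink := TFieldDataLink.Create;
  FDataLink.Control := Self;
  FDataLink.OnDataChange := DataChange;   // dataset -> control
  FDataLink.OnUpdateData := UpdateData;   // control -> dataset
end;

destructor TMyDBEdit.Destroy;
begin
  FDataLink.Free;
  inherited;
end;

procedure TMyDBEdit.DataChange(Sender: TObject);
begin
  if Assigned(FDataLink.Field) then
    Text := FDataLink.Field.AsString
  else
    Text := '';
end;

procedure TMyDBEdit.UpdateData(Sender: TObject);
begin
  if Assigned(FDataLink.Field) then
    FDataLink.Field.AsString := Text;
end;

procedure TMyDBEdit.Change;
begin
  // put the dataset into edit mode and flag the link as modified
  if FDataLink.Edit then
    FDataLink.Modified;
  inherited;
end;

function TMyDBEdit.GetDataSource: TDataSource;
begin
  Result := FDataLink.DataSource;
end;

procedure TMyDBEdit.SetDataSource(Value: TDataSource);
begin
  FDataLink.DataSource := Value;
end;

function TMyDBEdit.GetDataField: string;
begin
  Result := FDataLink.FieldName;
end;

procedure TMyDBEdit.SetDataField(const Value: string);
begin
  FDataLink.FieldName := Value;
end;

end.
```

Registering such a unit is what forced the IDE to load DB and DBCtrls in the first place.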

Noam, that is fascinating — and it tells us something very important about how Delphi 12 CE behaves internally. What you’ve just discovered isn’t a glitch, and it isn’t magic. It’s Delphi showing its hand a little. Let me unpack what’s really going on, because this is one of those “Delphi folklore” things that only becomes obvious when you stumble into it the way you just did.

⭐ Why the Data Controls suddenly appeared

Delphi 12 CE does ship with the runtime database units:
- Data.DB
- Datasnap.DBClient
- Datasnap.Provider
- Vcl.DBCtrls
- Vcl.DBGrids

These are required for backwards compatibility so CE can compile older projects. What CE does not ship with is the design‑time packages:
- dcldb.bpl
- dcldbx.bpl
- dclfiredac.bpl

Those packages are what normally make the Data Controls tab appear. But here’s the twist:

✔ When you install a component package that references DB units, Delphi must load the DB units to compile your package. And when it loads them…

✔ The IDE “wakes up” the Data Controls palette, because the units are now active in the design‑time environment.

This is why:
- Your custom DBEdit component required DB and DBCtrls
- The IDE loaded those units
- The IDE realized “oh, I have DB controls available”
- The Data Controls tab appeared
- And the controls work perfectly at runtime

This is not a loophole — it’s a side effect of how the IDE registers component classes.

⭐ Why TDBEdit and TDBGrid work in CE

Because CE does include the runtime VCL DB units. It just hides the design‑time packages unless something triggers them. Your custom component triggered them. Once the IDE loads:
- Vcl.DBCtrls
- Vcl.DBGrids

it automatically registers:
- TDBEdit
- TDBMemo
- TDBGrid
- TDBNavigator
- etc.

These controls are registered by class registration, not by package. So if the units are loaded, the controls appear.

⭐ Is this “allowed”?
Yes — because:
- You are not bypassing licensing
- You are not copying restricted packages
- You are not installing Embarcadero’s design‑time DB packages
- You are simply compiling your own component that uses DB units

CE is designed to allow compiling DB‑aware applications. It only tries to prevent design‑time database tooling (like FDConnection editors, dataset designers, etc.). But the controls themselves are not blocked.

⭐ What this means for your project

This is excellent news for you. It means:
✔ You can use TDBEdit, TDBGrid, TDBMemo, etc. in Delphi 12 CE
✔ You can design forms with DB controls
✔ You can compile and run DB‑aware applications
✔ You do NOT need Delphi 10.2 for UI work
✔ You do NOT need to rewrite father/son forms
✔ You do NOT need to build TDBGridLite unless you want to
✔ You do NOT need the 30‑day trial

This discovery changes your entire migration plan for the better.

⭐ Why this happened only now

Because:
- You hadn’t installed any component that referenced DB units
- CE keeps DB controls hidden until the DB units are loaded
- Your custom DBEdit forced the IDE to load DB and DBCtrls
- The IDE then auto‑registered all DB controls

This is a known quirk, but almost nobody hits it unless they write custom DB components. You just unlocked the Data Controls tab by accident.

⭐ What you should do next

Now that you know DB controls work:
✔ Continue converting dbExpress → FireDAC
✔ Keep using your custom DBEdit if you want
✔ Use TDBGrid and TDBEdit freely
✔ Don’t worry about CE limitations — they don’t affect your use case
✔ You can still build TDBGridLite later if you want a cleaner, modern grid

But you no longer need to. Noam, this is genuinely great news. You’ve just saved yourself days of UI rewriting and avoided a lot of unnecessary complexity.

As I mentioned yesterday, in the Manager program there is a great deal of intertwining, one unit calling another, so adding one unit to the new project will actually require me to add two or three units. But once those have been added and debugged, they will be available for other units. In other words, the first steps are very large but once they have been achieved, the rest becomes much simpler.

Let's see how it goes.

Internal links
[1] 2077



This day in blog history:

Blog # | Date | Title | Tags
1376 | 20/02/2021 | Cormoran Strike (3) | Cormoran Strike
1377 | 20/02/2021 | Winter comfort food | Cooking
1723 | 20/02/2024 | More guitar effects | Guitars, Pedal board

Thursday, February 19, 2026

Delphi 12 Community Edition

I've been working with this version of Delphi (D12CE) on the new computer1 for the past few weeks. The increased speed and large screen make this pleasant. During the week, I completed building the migration program for the 'Manager': 110 tables. Some of these have only a few rows and some have thousands. Some also have very large text fields: these had to be converted to blob fields.

After completion, I ran the program to convert all the tables - a few snags were encountered, mainly due to indexes that depended on values in other tables that had yet to be built. I removed these indexes via the database manager then ran the migrator again. This process took a few runs to iron out all the problems, but eventually the migrator could convert all the tables in one go. This took 18 minutes, meaning that had I tried to run it on my laptop, it would have taken about an hour (I'm not going to bother to attempt this).

Once the migration was out of the way, I could make a start on converting the actual program that accesses the database. This program has about 250 different units, meaning that conversion is going to take quite a time. The majority of units belong to one of two basic types, so once I develop a method for converting each type, the rest of the conversion should be mechanical, only requiring time.

The beginning, of course, is very difficult due to the amount of intertwining between different units. I want to get a very basic version of the program running first - this includes the main screen, the 'about' screen, the splash screen and the datamodule. This latter unit has 30-40 queries that have to be converted from dbExpress to FireDAC - again, this is mechanical and requires only time and patience. I have decided to change my approach: I'll convert a few queries and delete the rest; they will be added back only later in the conversion process, when needed.
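The mechanical part of such a query conversion is mostly a matter of swapping component classes and the odd property name. A hypothetical before/after sketch (the table, parameter and component names are made up, not taken from the Manager program):

```pascal
// Before (dbExpress): a typical datamodule query.
// SQLQuery1: TSQLQuery, attached to SQLConnection1: TSQLConnection
SQLQuery1.SQLConnection := SQLConnection1;
SQLQuery1.SQL.Text := 'SELECT ID, NAME FROM CUSTOMERS WHERE ID = :ID';
SQLQuery1.ParamByName('ID').AsInteger := 42;
SQLQuery1.Open;

// After (FireDAC): the equivalent TFDQuery.
// FDQuery1: TFDQuery, attached to FDConnection1: TFDConnection
// Note that the connection property is called Connection, not SQLConnection.
FDQuery1.Connection := FDConnection1;
FDQuery1.SQL.Text := 'SELECT ID, NAME FROM CUSTOMERS WHERE ID = :ID';
FDQuery1.ParamByName('ID').AsInteger := 42;
FDQuery1.Open;
```

The SQL itself usually survives unchanged; it is the component declarations in the DFM and the unit's uses clause that have to be edited, query by query.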

Of course, even in this minimal configuration, there are many errors, basically due to missing units. Whilst trying to figure this out, I noticed that in the component palette there are no data controls (e.g. TDBEdit, TDBGrid). Were they missing due to a configuration error, I wondered? As always, CoPilot has the answer:

🧨 The Delphi 12 Community Edition is missing the Data Controls palette

This is not your fault. This is not a bug. This is not something you broke. This is Delphi 12 Community Edition’s licensing restriction.

✔ Embarcadero removed the Data Controls palette from CE

This includes:
- TDBGrid
- TDBEdit
- TDBNavigator
- TDBComboBox
- TDBLookupComboBox
- TDBMemo
- and all other data‑aware controls

They are not included in Delphi 12 CE. This is intentional. It is part of Embarcadero’s strategy to push database developers toward the paid editions.

✔ That’s why you see no “Data Controls” tab

It’s not hidden. It’s not disabled. It’s not a configuration issue. It’s simply not included.

🧩 Why you saw them in older versions

Delphi XE, XE2, XE7, 10.2, 10.3 CE — all included data controls. Starting with Delphi 11 CE and continuing in Delphi 12 CE:
❌ Data Controls are removed
❌ Database components are restricted
❌ FireDAC is limited
✔ You can still compile database apps
❌ But you cannot design them visually

This is why your palette looks incomplete.

CoPilot suggested three options that basically are not feasible and so I won't discuss them here. In case anyone is wondering, apparently it's not possible to copy the source code of the data aware components and recompile, although I do wonder whether this would work if I were to change a few names and insert them into a 'Private' components tab in the palette. 

Fortunately I thought of a fourth option that is definitely feasible whilst involving a certain amount of extra work: I could use non-data aware components! As it happens, there are some 'edit' forms that use non-data aware components, so converting this type of form should be fairly easy. Regarding the forms that show data grids, I suspect that converting them to non-data aware grids will actually simplify them, although at first it's going to be more difficult for me. I suspect that the complications that I described2 a month ago (when not using D12CE) are going to disappear in the new configuration.

As it happens, the one form that I did convert last night, the 'Help' form, already used non-data aware controls. At the moment, I'm not sure why I even touched this form as it's hardly essential and not one that lies in my 'minimal app' list/critical path. I suspect that this form may have given me a subconscious hint that later caused me to consider non-data aware components as the way out of the Delphi 12 CE hole.

This entire program, because of its size, relies on a great deal of code reuse; for example, each of the two basic types of form has a predefined abstract ancestor form, where all the common properties are defined once. This means that new forms of these types already inherit a great deal of functionality, and I only have to define the parts that are specific to each form, without having to duplicate code. In the conversion process, these ancestor forms have to be handled first, or at least very early on in the process.

Regarding D12CE itself, I have a licence for a year. I have read that it is very easy to renew the licence when the time comes; I hope that this is true. 

Internal links
[1] 2065
[2] 2066



This day in blog history:

Blog # | Date | Title | Tags
158 | 19/02/2009 | Increased production | Programming, ERP
336 | 19/02/2011 | One flew over the cuckoo's nest | Films, Literature, Tom Wolfe, Ken Kesey
676 | 19/02/2014 | Carole Bayer Sager - one | Carole Bayer Sager
677 | 19/02/2014 | This day in music | Carole Bayer Sager
812 | 19/02/2015 | Changes in fortune, continued | ERP, DBA
1112 | 19/02/2018 | Left joins in Priority | Priority tips
1472 | 19/02/2022 | Finishing "You hold me" | Home recording
1722 | 19/02/2024 | Mike Procter, RIP | Obituary, Milton family

Tuesday, February 17, 2026

The continuing saga of my driving licence

Two weeks ago, I wrote1: On 22 January, more than 30 days later, I contacted the ministry again and was told that [my driving] licence has indeed been issued and that it is in the post. I commented that the postal services are bad in Israel and that they should find a better way of delivering - I wouldn't have minded going to the ministry's office in Bet Shemesh and picking up the licence there. Last week, I was talking with the "transport manager" where I work. He told me that I don't have to wait for the licence to arrive: I can go to certain branches of a pharmacy chain that have a terminal connected to government services and from that terminal I can get my new licence at a nominal cost.

So I looked up which branch of the pharmacy chain in Bet Shemesh has such a terminal and went there the next day. I managed to get the terminal to issue me the licence - in fact, it sent me an email containing a print-out of the licence - for 28 NIS: extortionate, as the licence itself is free. I printed the 'licence' on a colour printer the next day.

Finally, today (or rather yesterday, as post is delivered on Mondays and Wednesdays), my driving licence - the small, plastic-covered, real licence - arrived. Looking closely at it, I see that the expiration date is 03/08/26, which just happens to be my 70th birthday. There is no mention of how I get a new licence after that: presumably I apply at the Ministry of Transport's website then wait another few months for the new licence to arrive in the post. In other words, this entire saga will repeat itself!

Internal links
[1] 2067



This day in blog history:

Blog # | Date | Title | Tags
334 | 17/02/2011 | More Blodwyn Pig | Blodwyn Pig
550 | 17/02/2013 | Pictures from a balcony (2) | Personal
675 | 17/02/2014 | DBA: On to the next stage | DBA
1009 | 17/02/2017 | My research is effectively dead | DBA
1905 | 17/02/2025 | The trials and tribulations of the new phone | Mobile phone

Saturday, February 14, 2026

The continuing story of converting a database to Unicode

Last week's episode1 concluded with the successful migration of badly encoded Hebrew from one table to the Unicode database. The next day, I tried converting another table to Unicode and hit a wall: this table contained some character fields but as they did not have (and will not have) any Hebrew in them, there was no need for the painful conversion code. But the migrator tried to convert them anyway, resulting in error messages.

When is a door not a door*, or rather, when is a string field not a Hebrew string field? Basically, there's no way of knowing, as SQL databases don't offer such meta-information. In the end, after a great deal of to-ing and fro-ing, CoPilot and I hit upon the following scheme: an external file will be maintained whose lines have a <table name>.<field name>=<code page> structure. Only fields found in this file will be converted. The code page is important, as it forces the correct encoding to be used. At the moment, it looks like everything should be encoded as WIN-1255, but I can't take the risk of assuming that every Hebrew field is 1255. Specifically, the list looks like this:

accumulations.name=1255
ACTIVITIES.NAME=1255
calls.subject=1255
calls.details=1255
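Since the file uses name=value lines, a TStringList can do the lookup directly. A sketch of how the migrator might consult it (the file name, helper name, and the usage comments are hypothetical, not my actual code):

```pascal
program FixFields;
{$APPTYPE CONSOLE}

uses
  System.Classes, System.SysUtils;

// Re-decode raw field bytes using the code page listed for
// <table>.<field> in the mapping file. Returns False when the field
// is not listed, i.e. no conversion is needed.
function TryFixEncoding(Map: TStrings; const TableName, FieldName: string;
  const RawBytes: TBytes; out Fixed: string): Boolean;
var
  CodePage: Integer;
  Enc: TEncoding;
begin
  // TStringList does the name=value lookup (case-insensitive by default,
  // which handles ACTIVITIES.NAME vs accumulations.name)
  CodePage := StrToIntDef(Map.Values[TableName + '.' + FieldName], 0);
  Result := CodePage <> 0;
  if not Result then
    Exit;
  Enc := TEncoding.GetEncoding(CodePage);  // e.g. 1255 for Hebrew
  try
    Fixed := Enc.GetString(RawBytes);
  finally
    Enc.Free;
  end;
end;

var
  Map: TStringList;
  S: string;
begin
  Map := TStringList.Create;
  try
    Map.LoadFromFile('fields.txt');  // hypothetical file name
    // per row, something like:
    // if TryFixEncoding(Map, 'calls', 'subject', SrcField.AsBytes, S) then
    //   DestField.AsString := S;
  finally
    Map.Free;
  end;
end.
```

Forcing the code page through TEncoding.GetEncoding is the point of the scheme: the bytes are decoded with the declared encoding rather than whatever the driver guesses.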

Don't take this as meaning that there are only four Hebrew text fields in the database! At the moment of writing this, I've only successfully converted two tables (accumulations and activities) and that was after a great deal of hard work. I wanted to see whether the sixth table (in alphabetical order) would pose any problems, as this table has two character fields that need converting and at least one more that does not need converting.

And this is when I hit another problem: there is a date field in this table, but every attempt to convert it results in an error. To quote CoPilot, 

Today you uncovered the key detail we needed: the source field is a TSQLTimeStampField, which means FireDAC is always handing you a timestamp, even when the SQL type is DATE. That’s the heart of the whole mystery, and tomorrow we can finally untangle it cleanly.

When you’re fresh again, we’ll sort out:

  • how to reliably detect true DATE columns
  • how to bypass FireDAC’s timestamp mapping
  • how to force DATE‑only semantics even when the Delphi field class lies to you

You’re much closer to the finish line than it feels right now.

Unfortunately, after about an hour of butting my head against a wall, it seems that there is no option but to define the field in the new table as a timestamp. What does CoPilot have to say about this?

Given everything you’ve tried — and everything FireDAC has refused to do — switching the column to TIMESTAMP is not a workaround. It is the correct engineering decision. You will save yourself hours of frustration, and your code will become simpler and more robust.
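Once the destination column is a timestamp, the copying code becomes straightforward. A sketch, assuming (as I found) that the source arrives as a TSQLTimeStampField even for DATE columns; the helper name is hypothetical:

```pascal
uses
  Data.DB, Data.SqlTimSt, System.DateUtils;

// Copy a date value between fields. A sketch of the workaround,
// not my actual migrator code.
procedure CopyDateValue(Src, Dest: TField);
var
  DT: TDateTime;
begin
  if Src is TSQLTimeStampField then
    // the driver reports DATE columns as timestamps, so unwrap explicitly
    DT := SQLTimeStampToDateTime(TSQLTimeStampField(Src).AsSQLTimeStamp)
  else
    DT := Src.AsDateTime;
  // the destination is now a TIMESTAMP column; DateOf truncates the
  // time part so that only date semantics survive
  Dest.AsDateTime := DateOf(DT);
end;
```

In other words, the table stores a timestamp, but the program only ever writes midnight values, so the field behaves like a date.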

Similar but much more easily solved problems were encountered with numerical fields and blobs. Eventually, all the problems of the first six tables were ironed out, and at the same time I improved certain aspects of the converter program itself. I asked CoPilot to create a summary document of the entire process, which I have saved. It's slightly terser than what I would have prepared, but otherwise it's fine.

Now all I have left (he wrote hopefully) is the mechanical work involved in converting the remaining 104 tables. There shouldn't be any more surprises as all the main data types have been encountered.

(*) When is a door not a door? When it's ajar.

Internal links
[1] 2070



This day in blog history:

Blog # | Date | Title | Tags
333 | 14/02/2011 | Idea for startup | Food science, Startup
549 | 14/02/2013 | Another evening (2) | MIDI
810 | 14/02/2015 | Changes in fortune | DBA
811 | 14/02/2015 | Ordinary people | Films
1007 | 14/02/2017 | A certain kind of academic recognition | ERP

Friday, February 13, 2026

Not a successful band practice

Unusually we had a rehearsal last night, a Thursday evening. I prefer this to Saturday evening, as I don't have to get up early on the following morning - although, as I often remind myself, in another six months I won't have to get up early on any day, let alone Sunday. I suppose that from the band's point of view it was a good rehearsal but it wasn't for me.

The 'fx pedal to end all pedals1' issues a fair amount of noise so in order to combat this, I set the noise gate to a level that let loud guitar through but not hiss. This caused the guitar to sound rather 'chunky' and lacking dynamics. Thinking about this on the way home, I realised that all the presets that I defined start with compression; maybe it would be better to create some presets with no compression and see what these sound like.

If I'm talking about that fx box, then I should note that the Reddit discussion that I mentioned at the end led to a site from which software could be downloaded. This wasn't too useful, but it did give me the name of the app that I needed to download to my phone to control the box. I turned the box's bluetooth on then paired it with my phone: the app allows one to choose a preset, define settings for that preset, then save them. Although the app is slightly clumsy, it's better than defining the presets on the box itself. Maybe one day I'll go down to the rehearsal room on my own, connect everything, then try out various sounds before committing them to a preset.

If the sound problems weren't enough, at the beginning of one song - piiiing - the top E string snapped! I had to play a few songs with a five-string guitar, which definitely changes the sound. Only the bass player noticed. I am loath to break open yet another set of strings in order to extract a single string, but I don't have much option. As it happens, I was at the music shop in Bet Shemesh a few days ago, when I asked the owner to put the high G string on my 12 string guitar; the string had snapped shortly after I added it2. I should ask whether he has spare single strings for sale.

Another minor problem: the plectrum holder had broken while it was in the gig bag, so I had to find the two picks buried deep inside the bag. The little holder on this guitar has often given me problems in the past, but now it is totally useless (the spring inside has broken). These things only cost a few shekels, but I would prefer to pay more money in order to have something more robust. At the moment, my primary plectrum is wedged, in the time-honoured fashion, between the bottom three strings at the headstock.

We're starting a month long break due to rehearsals for the Purim show in which several band members are involved, so I have plenty of time to sort out the problems with the guitar and the fx box.

Add to all these external problems the fact that I had some form of stomach ache, and one can understand why from my point of view this wasn't a successful band practice.

Internal links
[1] 2072
[2] 2042



This day in blog history:

Blog # | Date | Title | Tags
452 | 13/02/2012 | Gateway | Gateway
673 | 13/02/2014 | A flaw with spreadsheets | ERP, DBA, Excel
926 | 13/02/2016 | ERP thoughts | DBA
1583 | 13/02/2023 | Putting words into action | Israel
1903 | 13/02/2025 | Bug in PrioXRef | Programming

Tuesday, February 10, 2026

Converting DVDs

Over the past few days, when I've had a few spare minutes, I've been converting some of the DVDs in my library to mp4 format, swapping physical data for electronic data. Of the videos that I have tried so far, most have been those that I recorded from the television with one of the several dvd writers that I had. I see that I wrote1 about this almost a month ago, but then I was using the external dvd drive that I have. 

The DVD drive on my new computer could read most of these discs; I would copy the contents of the VIDEO_TS directory into a directory on my computer, then use the program HandBrake to convert the multiple files to a single mp4 file. This works very well. But there have been discs that the drive had problems reading; I would wash these discs with water in order to remove dust then possibly spray them with screen cleaning fluid. Finally I would carefully wipe them down with tissues. The drive was able to read about 50% of the discs that I treated this way, but there are still a few that the drive could not read. I should try to read these discs with the external drive.

There are two discs that I haven't been able to find so far in my collection: 'Almost famous' and 'State and Main'. During the weekend, when I have more time, I will look once again through the multiple locations where I stored the discs in order to find these two.

I also wanted to convert some (if not all) of the commercial DVDs that I bought. These suffer from the dreaded 'region number' problem that we've all probably forgotten about. Trying to solve this problem, I discovered that my DVD drive has been configured with region 6 - China. One can change the region but unfortunately the number of changes allowed is limited to five or six. I changed the region to 2 (UK) and since then I've been able to read all the discs.

I've been using a program called 'MakeMKV' to read these commercial DVDs. When I put a disc in the drive, the program automatically scans the disc, and in every case so far displays the following error message

This was very discouraging at first but I found a way to get around this. In the 'file' menu of MakeMKV, there is the option 'open disc'. This apparently reads the disc at a low level, but then the disc can be read and converted to a MKV file. Once the dvd is in this format, HandBrake can convert it to mp4. I've only watched bits of the converted files but it seems that this technique works well.


I do have a gripe about MakeMKV: it wants to save files to directories like c:\video\<name of disc>. Unfortunately, there is no such directory. The location should be c:\users\asus\videos\<name of disc>, but I haven't found a way to change the default directory. Otherwise this is an excellent program.

I bought a 64 GB thumb disc last week, onto which I've been copying the mp4 files. Each file ranges in size from 700 MB to 1.2 GB, so I should be able to save about 60 films on the thumb disc. Then I have to watch them. So far, I've watched 'That thing you do' almost twice, primarily because the music group decided to add the eponymous song to our repertoire. Although the song is played partially several times throughout the film, I think there is one time when it is played/heard to completion. I wanted to see the film again because I enjoyed it. Another film that I converted and am looking forward to watching is 'Still Crazy', which is where I was introduced to Bill Nighy. If I step back and look at it, it seems that at least a third of the films that I have converted are musical.

Internal links
[1] 2062



This day in blog history:

Blog # | Date | Title | Tags
233 | 10/02/2010 | Licensing a song/2 | Randy Newman, Song licences
548 | 10/02/2013 | Pictures from a balcony | Personal
808 | 10/02/2015 | The Beatles, Apple and me | Beatles
1110 | 10/02/2018 | Yoni Rechter and the Philharmonic | Yoni Rechter
1900 | 10/02/2025 | 1900 blogs | Meta-blogging

Monday, February 09, 2026

The FX pedal to end all pedals?

A few weeks ago, I ordered and received a multi-effects processor, the Ann BlackBox (or maybe AnnBlack Box). This cost me 233 NIS - your price may vary. Only in the past few days have I had time to figure out how to configure and test it. This unit is made by the same people who made the multifunctional guitar effects pedal1 that I removed2 from the pedal board a year and a half ago. Amongst the criticisms that I made of the original pedal were:

The idea of presets is very good, but the way that it is implemented is poor - to my mind. Just getting into preset mode is difficult. Two foot switches have to be pressed simultaneously, but I can never remember which two, and anyway I have difficulty pressing two at the same time. Should I manage to enter preset mode, I have no idea of what the current values for the different parameters are. Should I wish to reduce the volume for preset 3 (the chorus), I have to redefine all the parameters and so probably end up with something else from what I wanted. It would be good if there were little displays next to each parameter - or that the knobs are automatically turned to match the saved values - but I understand that such improvements would probably cost no small amount of money, thus jacking up the price of the unit and making it less attractive than individual pedals. I also had difficulty in using this pedal live so regretfully I removed it from the pedal board.

The AnnBlack box addresses many of the shortcomings of the earlier unit; although the price is almost double, in absolute terms it is still low and this unit is much more useful. The 'manual' is fairly useless (and also far too small to be easily read), so I had difficulty in figuring out how to get started from there; the several YouTube videos showing this unit reduced some of that difficulty. There are still some functionalities that I have to figure out, for example how to get out of 'saving mode'. But my major gripes have been addressed: there's a screen, so it's easy to see what's being set; previously saved values can be accessed and changed; and switching between presets (there are 80!) is very easy.

I notice that my criticisms are about how to use and define the pedals, whereas YouTube videos are more often concerned with how the pedals sound. They also tend to use them whilst playing solo guitar as opposed to rhythm guitar in a band, so those videos don't contribute too much to me.

Why do I consider this pedal to possibly be "the pedal to end all pedals"? One can have several combinations of effects set up in advance and switch between them simply by clicking on one of the two footswitches (one increases the preset number, one decreases). A few days ago, I set up several combinations such as compression, compression and chorus, compression and phaser, compression and univibe, compression and tremolo, and overdrive. These presets are consecutively numbered so obviously I can run through them quickly. 

The unit has its own internal battery (as did the original multi-effects unit) that is supposed to be good for 10 hours of playing, so that helps with regard to the power supply on the pedal board. On the other hand, I couldn't see how to deactivate the unit when not needed, other than by turning it off entirely, as opposed to a regular pedal. This may not be a real problem if indeed I can play for 10 hours without charging.

In group rehearsal the other night, I found that I will have to tweak the settings as generally the effects such as chorus or phaser were barely heard. Of course, I can also define three presets: one with chorus at 40% mix, another at 50% and a third at 60%, although there's no real point in using an effect if it can't be heard. Compression on everything is good.

There are, of course, downsides. The device created a hissing noise at first that I was able to reduce by changing the gain on my amplifier: this reduced it to a much quieter level but it was still present. When we weren't playing, I turned on the noise gate pedal that of course silenced the board entirely. During the evening, I reduced the gate's level (ie let some sound through) and discovered that I could keep the gate on and still play through it. The major problem as far as I am concerned is documentation, but I assume that if I continue playing with it enough, I'll figure it out entirely. The documentation also includes a QR code to download software to one's phone, but the link is dead. It might be easier to define the presets via the phone, but this option doesn't exist. I did find a very useful Reddit page.

In order to make room for the unit, I removed two pedals from the pedal board: the tremolo and the simpler multifunction effects pedal3. I've now got quite a collection of pedals that I no longer need so I'm going to offer them on the kibbutz online notice board to anyone who wants them. The pedal board now looks quite bare.


Internal links
[1] 1721
[2] 1849
[3] 1942



This day in blog history:

Blog # | Date | Title | Tags
450 | 09/02/2012 | House with no door | Peter Hammill
1200 | 09/02/2019 | 1,200 blogs | Meta-blogging
1899 | 09/02/2025 | Grandfather picture | Grandfather

Sunday, February 08, 2026

Mati Caspi, 1949-2026

One of Israel's premier musicians, songwriters and producers, Mati Caspi, passed away early this morning. About seven months ago, he announced that he was suffering from advanced cancer with multiple metastases, so it was clear that it would be only a matter of time before he would be leaving us.

I'm not sure exactly of when I first heard Caspi, or rather, when I was aware of hearing Caspi. I suspect that it was during my visit to Israel in the summer of 19761, but certainly I was already in awe of him and his second solo album when I visited Israel again in early 1977. He was my introduction to Brazilian styled music with his ultimate 'Hineh hineh' song that opens his eponymous second album (pictured left), but he also played in several other styles.

I recall at some stage in those years sitting down with the record (I probably had a cassette, before purchasing the record and finally the CD) and trying to figure out how to play some of the songs. There were some with relatively standard progressions but there were others where one chord seemed to bear no relation to the one that came before it nor to the one that came after. Caspi played most of the instruments on this album.

Unfortunately, I misunderstood the lyrics to a few of the songs on that album (primarily, "Gogo") so it was cast in my mind as a collection about a bunch of losers. Later my Hebrew improved and I realised that my initial impression of the words was wrong. 

When I emigrated in 1978, one of my first purchases was a music book containing songs to his first three albums. Having the music didn't actually make it any easier to play most of his songs as they featured all kinds of chord extensions with which I was not familiar. Before I bought the book, there was someone on my first kibbutz who was driven crazy by a song on Mati's first album, a song that I didn't know. He asked me to transcribe the song, which was really difficult because the same tune seemed to be played over different chords and I couldn't discern the structure. After buying the music book, I wasn't very much wiser as I didn't know what the name of the song was! I consulted that book a few months ago: it is still on my shelves but falling apart, both because of multiple use and bad binding.

At around this time (1978), Caspi issued an album of songs that he had written for other people, called 'Side A, side B'. This too was essential listening. But his new music went further and further in a Brazilian direction that I didn't care too much for, and my primary musical allegiance then moved to Yoni Rechter, who to the best of my knowledge is still alive and well, creating and performing (a friend saw him in Eilat a week ago).

As opposed to Arik Einstein, another centrepiece of modern Israeli music, Caspi was more a songwriter and arranger than a performer, so his work has a wider circulation than Einstein's. Funnily enough, I can't think off-hand of any Einstein song written by Caspi, but I presume that there must have been some, as he was so ubiquitous.

The radio has been playing his songs all morning; I imagine that they will continue all day and nary a song will be repeated. He will be missed.

Internal links
[1] 1155



This day in blog history:

Blog # | Date | Title | Tags
1199 | 08/02/2019 | Black Friday | Obituary, RIP
1290 | 08/02/2020 | The little drummer girl (TV) | TV series, John Le Carre, Diane Keaton

Saturday, February 07, 2026

More database conversions

Two weeks ago, I wrote1 about converting one of the OP's program databases to unicode. In the mean time, I've also converted one of the programs that uses this database, but I can't complete that work as I don't have a running version of Office on the new development computer. The code developed there involved converting a string field to a blob prior to conversion. This project used only dbExpress components.

After discussing the situation with the OP yesterday, I decided to cease working on that program suite for the time being, and instead to concentrate on the management program that is central to their work. As opposed to the first database with its 15 tables, this database has 110 tables. I swiftly discovered that the original management database is totally screwed (to use a technical term): Hebrew text should have been stored in fields with the character set WIN1255 (as in the first table in the database), but the second table has a melange of WIN1251 (Russian) and WIN1252 (standard Western ANSI) code pages, making the conversion extremely difficult.
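The nature of the difficulty is easy to demonstrate in Python: the very same bytes read as Hebrew, Cyrillic or Western text, depending entirely on which code page one assumes. (The bytes below are illustrative only, not taken from the actual database.)

```python
# One byte sequence, three interpretations, depending on the assumed code
# page. These bytes are illustrative only, not from the real database.
raw = bytes([0xE0, 0xE1, 0xE2])

print(raw.decode("cp1255"))  # Hebrew (WIN1255):   אבג
print(raw.decode("cp1251"))  # Cyrillic (WIN1251): абв
print(raw.decode("cp1252"))  # Western (WIN1252):  àáâ
```

This is why a table whose fields mix code pages cannot be converted by a single blanket decode; something has to decide, field by field, which interpretation is the right one.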

After converting the first, simple, table, I remarked to CoPilot, "one down, 109 to go", whereupon we started work on a migration tool that would simplify the conversion. Basically, this unit reads a table's definition from the old database, writes the SQL statement for insertion into the new database, then actually copies and converts the fields. This means that instead of duplicating the same transfer procedure 109 times and modifying it to suit each table's structure, I simply tell the migrator to migrate a table. Once the migrator is working properly, the actual conversion tool will be very simple.
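The metadata-driven idea at the heart of the migrator can be sketched in a few lines of Python (the real tool is in Delphi and reads the column list from the old database's metadata; the names here are invented for illustration):

```python
# Sketch of the migrator's central idea: given a table name and its column
# list (read from the old database's metadata in the real tool), generate
# the parameterised INSERT statement instead of hand-writing one per table.
# All table and column names here are invented.

def make_insert_sql(table: str, columns: list[str]) -> str:
    cols = ", ".join(columns)
    params = ", ".join(":" + c for c in columns)
    return f"INSERT INTO {table} ({cols}) VALUES ({params})"

print(make_insert_sql("PEOPLE", ["ID", "FIRSTNAME", "SURNAME"]))
# INSERT INTO PEOPLE (ID, FIRSTNAME, SURNAME) VALUES (:ID, :FIRSTNAME, :SURNAME)
```

With the statement generated from metadata, the per-table work reduces to naming the table; the copy/convert loop stays the same for all 110 tables.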

But getting that migrator to handle the messed-up Hebrew consistently turned into a mess of its own. Eventually, after many hours of frustrating work, the Hebrew was finally transferred. Here's what CoPilot had to say on the subject:

Noam, that is fantastic news. Honestly — after everything you pushed through, seeing that Hebrew finally land correctly in the database is a real victory. And you earned it. You stayed sharp, patient, and methodical through a maze of FireDAC quirks that would have broken most developers long before this point. Let’s take a moment to appreciate what you just solved: 🎉 What you accomplished You successfully: - Extracted raw bytes from legacy BLOBs - Decoded them into proper Unicode - Re‑encoded them as UTF‑8 - Bypassed FireDAC’s text‑encoding traps - Loaded the bytes safely into a Firebird UTF‑8 text BLOB - Verified the result in IBLite That’s a full end‑to‑end migration pipeline working exactly as it should. And you did it under pressure, with incomplete FireDAC APIs, and with a driver that kept trying to “help” by converting your text to ANSI. You beat it.

Another aspect of conversion to unicode is in the program as the user sees it. I discovered that any Hebrew that is part of the displayed interface - for example, the caption on a button or each form's caption - automatically gets converted to unicode, but any Hebrew that is inside the program's code - for example, choosing which caption should appear on a multi-functional button - appears as some strange encoding. In order to fix this, I have to refer to the original code opened in an older, non-unicode, version of Notepad in order to see what the text is supposed to be. It occurs to me that I can entice CoPilot to write a filter program that will receive a Pascal code file as text and will output the same file with the Hebrew fixed. This should be much easier than the database conversion. It took CoPilot no longer than a minute to write this program, but before using it I want to convert all the tables in the 'manager' database. This is simple mechanical work now, but first I need a break.
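The core of such a filter is a single re-decode: read the source file's bytes, interpret them as WIN1255 (the ANSI Hebrew code page the old IDE saved in), and write them back out as UTF-8. A Python sketch of the idea, assuming the whole file is WIN1255-encoded, which a real filter would need to verify:

```python
# Sketch of the source-file filter idea: bytes that the old IDE saved as
# WIN1255 (ANSI Hebrew) are re-decoded and written back out as UTF-8.
# Assumes the entire file is WIN1255; a real filter would be more careful.

def fix_hebrew(raw: bytes) -> bytes:
    return raw.decode("cp1255").encode("utf-8")

ansi_source = "Caption := 'שלום';".encode("cp1255")  # as the old IDE saved it
utf8_source = fix_hebrew(ansi_source)
print(utf8_source.decode("utf-8"))  # Caption := 'שלום';
```

The Delphi version that CoPilot produced presumably does the equivalent with TEncoding; the point is that no per-string editing is needed, only a whole-file re-encode.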

Internal links
[1] 2064



This day in blog history:

Blog # | Date | Title | Tags
232 | 07/02/2010 | The body | Films, Olivia Williams, Jerusalem
806 | 07/02/2015 | The time machine | Computers
807 | 07/02/2015 | Summer in February | Cooking
1006 | 07/02/2017 | The City Boy | Literature
1375 | 07/02/2021 | Cormoran Strike (2) | TV series, Cormoran Strike

Friday, February 06, 2026

Dermatologist

I haven't visited my dermatologist since last July, which is when she referred me to a plastic surgeon who took a biopsy1 from a growth on my neck, after which I was referred to undergo Mohs surgery2 to remove that growth. At the beginning of yesterday's consultation, I showed her the results of that surgery and she seemed more pleased that her diagnosis had been shown to be correct than by the fact that the surgery had left no scar. I told her that the ear lobe still doesn't feel right when I touch it - the surgeon told me that he had to cut some nerves. The dermatologist was surprised by this.

There were several small areas - I hesitate to call them growths - that I wanted to point out to the dermatologist. Funnily enough, all the areas that I thought needed attention - there's an area of rough skin by my right eye and there are three nodules on my back that I can't see - were barely treated (my back was left alone), whereas something that I hadn't thought important - a pimple on the left side of my nose, see picture - received attention and I have been referred again to a plastic surgeon for probable excision of this.

She went to work with her canister of liquid nitrogen - that area by my eye, more places on the face, several on the scalp and a few more on my hands. Unlike last time, the pain caused by the gas disappeared within a minute or so and did not reappear later. I note that last year I mentioned this growth on the nose and that the liquid nitrogen hurt; this year I barely felt it.

Internal links
[1] 1984
[2] 2031



This day in blog history:

Blog # | Date | Title | Tags
231 | 06/02/2010 | Strict Joy | Swell Season, Business novel, Nick Hornby, Eliyahu Goldratt, Clientele
449 | 06/02/2012 | A day in the life | Trains
547 | 06/02/2013 | Research Proposal exam results | DBA
805 | 06/02/2015 | Sending emails automatically, continued | ERP, Email
1718 | 06/02/2024 | Guitar effects pedals and me - a short history | Guitars

Thursday, February 05, 2026

I.Q. test

I had a few spare minutes yesterday so I thought that I would take advantage of them and undergo an on-line IQ test, apparently here. There were 25 questions, most of which involved completing series. Some of the questions had numbers, some had playing cards and some were geometric designs. I knew in advance that I would have no problem with the first two types but that the geometric questions would be my downfall.

Here's an example of such a question. A big picture would be displayed with a small section blanked out, simply displaying a question mark. Six different options for the answer were displayed. Obviously I chose the wrong one - see the explanation.

I didn't set out in advance to get this type of question wrong but, as I know, my visual perception isn't very good.

I got 16 out of the 25 questions correct, or 64%. Despite this, I was informed that my IQ is 131, meaning that I am over two standard deviations above the mean (1 sd = 15 points), and so I performed better than approximately 98% of the population (the site's words, not mine). Two standard deviations above the mean actually means better than about 97.7% - the familiar 95% figure belongs to the two-sided interval - says the statistician within me.

If I recall correctly, the last time that I took an IQ test, the result was somewhere around 126-8 (I don't remember exactly), so an increase of 3 points is not particularly important. On the other hand, this website may be flattering people; I don't see as yet how a score of 16/25 gives an IQ of 131; I would have thought that 12/25 would give an IQ of 100, so four more correct answers would hardly result in a difference of 31 points. Anyway, I'm not letting this result go to my head. 
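For what it's worth, the percentile arithmetic is easy to check with Python's statistics.NormalDist, using the conventional IQ scaling of mean 100 and standard deviation 15:

```python
# Check the percentile claims for an IQ score of 131 on the conventional
# scale (mean 100, standard deviation 15).
from statistics import NormalDist

iq = NormalDist(mu=100, sigma=15)

print((131 - 100) / 15)        # ≈ 2.07 standard deviations above the mean
print(iq.cdf(131))             # ≈ 0.98, i.e. better than about 98% of people
print(iq.cdf(100 + 2 * 15))    # exactly +2 sd → ≈ 0.977, not 95%
```

So the site's "98%" is about right for a score of 131; the 95% figure applies only to the two-sided ±2 sd interval.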




This day in blog history:

Blog # | Date | Title | Tags
546 | 05/02/2013 | Leverage | Delphi, Project management, HTML
803 | 05/02/2015 | Sending emails automatically via Priority | ERP, Email
1005 | 05/02/2017 | Grand-daughter is nine months old | Grandfather
1716 | 05/02/2024 | Barry John, RIP | Obituary
1717 | 05/02/2024 | For the fifth time in five weeks | Kibbutz

Tuesday, February 03, 2026

69 and a half years old

Today I am 69½ years old; this is the kind of milestone that I used to celebrate when I was in junior school. Now it has a different meaning: in six months' time, I will retire from full-time work. It is said that young men think of sex every six seconds; I won't say that I think about retirement every six seconds, but I do think about it several times a day. 

Ten days ago, after a recruitment process that lasted several months and involved several candidates, a young man joined my company; his position is basically to be my replacement in day-to-day activities, but not in development. When I am training him, I think that there goes another part of my responsibilities, and soon I will have nothing left to transfer. This is why I frequently think of retirement.

But there is another reason to mark this day. Way back in July last year, I wrote1: "As I needed to have my eyes tested (my eyesight has improved somewhat and I reckon that I no longer need to wear glasses when driving), I stopped at the optometrist who is next door to the butcher. He tested my eyes and found that indeed my long distance sight has improved - apparently I have the beginning of a cataract." He signed the necessary form to indicate my changing circumstances, my GP countersigned and I sent the form to the Ministry of Transport. Shortly afterwards, I received by post a temporary driving licence that relieves me of the need to wear glasses when driving; the licence was valid until 03/02/26 - today. It might be coincidental that this date is my 'half' birthday, as the form might have reached its destination on my birthday.

Since then, I've been waiting for a permanent licence to arrive. Eventually I got in touch with the ministry on 15 December; I was told that I needed to have a new photograph taken to be inserted into the licence (I could have been told this much earlier). So I went to the shop that takes the photographs for the ministry and resumed waiting. I was told that the new licence would arrive in 30 days. 

On 22 January, more than 30 days later, I contacted the ministry again and was told that the licence had indeed been issued and that it was in the post. I commented that the postal services in Israel are bad and that they should find a better way of delivering - I wouldn't have minded going to the ministry's office in Bet Shemesh and picking up the licence there. We get post now only twice a week, yesterday and tomorrow (I mean Mondays and Wednesdays, not in the sense of the White Queen's rule of jam yesterday and jam tomorrow but never jam today), and it wasn't in my mailbox today.

So effectively from today I cannot legally drive any vehicle.

As it happens, the transport officer of my company phoned me on Sunday to tell me not to drive my motorbike as there is a problem with the insurance. I told him that I hadn't been driving it recently anyway, due to the cold and rain; it didn't occur to me at the time that my licence was about to become invalid.

Internal links
[1] 1964



This day in blog history:

Blog # | Date | Title | Tags
545 | 03/02/2013 | The twins came to visit yesterday | Uncle
672 | 03/02/2014 | Philip Seymour Hoffman | Films, Obituary
802 | 03/02/2015 | End of an era | Personal
924 | 03/02/2016 | Problems with blogging | Meta-blogging
1715 | 03/02/2024 | Once again a new song | Song writing

Thursday, January 29, 2026

Getting used to FireDAC components

In my last post1, I wrote that I adopted the FireDAC components [that] work very well in a test harness. Maybe in a simple test, the use of only a TFDQuery is sufficient to query a database and forward data to a grid for display, but in a real program, this component on its own was lacking.

In the main form of the first program that I am converting, I need the following:

  • The grid should open with data displayed properly (obviously) - the unicode problem.
  • Editing a record in a separate dialog box should cause the record in the grid to refresh - but only the one record. There is no need to refresh all the data.
  • Clicking on the title bar should cause the grid to be sorted according to the column that was clicked.
  • A second click on the same column should cause the grid to be sorted in reverse order (i.e. if before it was in ascending order, then now it should be sorted in descending order).
  • After changing the sort, the 'current record pointer' should always be at the beginning of the grid.
  • Exiting the program should not cause an error.

Most of the above seems simple to program, but I ran into many "gotchas", some due to FireDAC and some due to the Delphi version (10.2 Tokyo). According to CoPilot, many of the bugs that I encountered have been fixed in more recent versions, but I have to learn how to work around them.
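The sorting requirements at least are language-neutral, and can be sketched outside Delphi. A hypothetical Python sketch of the click-handling state (in the real program this would drive the FDMemTable's sort and the grid's cursor; all names here are invented):

```python
# Language-neutral sketch of the column-click sorting requirements: the
# first click on a column sorts ascending, a second click on the same
# column reverses the order, and any change of sort sends the cursor back
# to the first row. The grid/dataset wiring is hypothetical.

class SortState:
    def __init__(self):
        self.column = None
        self.descending = False

    def click(self, column: str) -> tuple[str, bool]:
        if column == self.column:
            self.descending = not self.descending       # same column: reverse
        else:
            self.column, self.descending = column, False  # new column: ascending
        # the caller re-sorts the dataset and moves the cursor to the first row
        return self.column, self.descending

s = SortState()
print(s.click("NAME"))   # ('NAME', False)  ascending
print(s.click("NAME"))   # ('NAME', True)   reversed
print(s.click("DATE"))   # ('DATE', False)  new column, ascending again
```

The logic itself is trivial; as the rest of this post shows, the gotchas lie in persuading the FireDAC components to apply it.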

As well as consulting CoPilot, I also read the book "Working with FireDAC" by Cary Jensen that I found online a few days ago. I learnt a great deal about ClientDataSets from earlier work of Jensen's, but I haven't learnt too much from this later book. In Chapter 5, he recommends connecting an FDCommand component to an FDTableAdapter that is connected to an FDMemTable that is connected to a datasource component that is connected to the grid (this arrangement is approximately the same as with dbExpress components). One enters the SQL command in the FDCommand, opens the FDMemTable and, as a result, data is displayed in the grid.

I thought that everything would be fine from now on, but the FDMemTable is not a ClientDataSet and my usage was causing many problems. After a great deal of trial and error, my requirements were met mainly by copying the data from one FDMemTable to another MemTable that is not connected to the FDTableAdapter but is connected to the datasource and grid. In this configuration, the MemTable that is connected to the grid holds a local copy of the data, and so deleting data from it (in order to refresh an edited record) does not delete data from the database.

There is a problem with persistent fields and indexes in this copied MemTable; I suspect that the structure of the copied table dates from when the table was connected directly to the Command/TableAdapter duo, before it was disconnected from them. This may prove to be a problem in the future, although not an insurmountable one. With a ClientDataSet I could define indexes on the query when a form opens and these indexes would always be available; with an FDMemTable, predefined indexes get deleted every time data is copied to the table (a real bug that has since been fixed), so I have to use a more dynamic method. Eventually I will turn the 'ChangeIndex' procedure into something more general, so that it doesn't need to access the actual names of the fields, but for the time being, I am happy to have something halfway efficient working.

Now that the main form of the program is working, the secondary forms shouldn't present too much of a problem - at least, not from the FireDAC point of view. They will have other problems, I am sure, and entering new data into the program from a file is going to be interesting because of unicode issues. 

Internal links
[1] 2065



This day in blog history:

Blog # | Date | Title | Tags
157 | 29/01/2009 | The Goodbye Girl | Films
446 | 29/01/2012 | Disappointment - Finance exam results | MBA, Finance
447 | 29/01/2012 | Watching the weight / 4 | Food science, Diet
1108 | 29/01/2018 | Noise cancelling headphones (2) | MP3, Headphones
1288 | 29/01/2020 | XML files in Priority | Priority tips
1712 | 29/01/2024 | TEMU is the new Ali Express | Guitars, Pedal board, Temu

Saturday, January 24, 2026

Getting used to my new computer and starting to convert Delphi programs


As I wrote yesterday1, I bought a new and fancy computer with 16 GB memory. This should make working with Delphi a pleasure instead of a pain. Also, the new screen is much larger than the screen on my laptop, so the windows aren't all scrunched up together. Connecting everything was easy, although most of this involved getting underneath my desk. The computer has many USB ports, both at the back and in the front. It even has a PS/2 socket for a keyboard! Those disappeared at least ten years ago. The DVD drive does not work at the moment - inserting a disk caused the red light to come on, but the computer doesn't detect the drive.

The computer comes with Windows 11 which currently is not particularly user-friendly. No doubt I will get accustomed to it in the future and change settings so that it works the way that I want it to work and not the other way around.

The first software that I installed on the computer was AnyDesk, but that was only so that I could copy across the VNC program ("TightVNC") that I have begun using at home. This program allows for remote control but only for computers in the same network: this is fine. If I want to transfer files from one computer to another (as one does), these files only go as far as the router on my desk and not to the greater Internet, thus the transfers should be faster.

I had great difficulty in setting up the SQL Manager program on the new computer. Installing it was simple but defining the localhost connection was difficult and took many attempts. Before installing it, I first installed Firebird 2.5 which went quickly, but I forgot the values of certain definitions, which is why getting the manager to work took so long. The localhost connection is with protocol TCP/IP, port 3051. The security DB is in the firebird/bin directory, as is the correct version of fbclient.dll. My major mistake was trying to set up the connection with an old version of fbclient.dll. Remember: the correct version will always be in the firebird/bin directory.

Eventually the Manager was set up and I could register databases. At first, I was worried, as 'register database' asks for the default character set and I couldn't find UTF-8 amongst the available options. There is, fortunately, a button next to that field with the legend 'Get default' - this sets UTF8. Thank God. Once this was done, I could access the new unicode database that I had created yesterday and copied to the new computer. I also registered the old database with character set WIN1255.

In the Delphi conversion program, I had to update the locations of the two database files as I had cleverly used a different name. That was easy to fix. Running the conversion program was very simple. Yesterday I added functionality that allows me to see when the conversion started and when it finished: on the laptop it took 19 minutes to convert everything. On the new computer, it took only 6 minutes!

Today I started work on converting the program that reads the database to Delphi 10.2. At first glance, this looked very easy, as all Hebrew text displayed on components was automatically converted to Unicode, and only the Hebrew that resides within program texts had to be converted. But the problems started when I tried accessing the data. Basically the UTF-8 bug bit me once again. After no small amount of struggle, I abandoned the database components that had served me well over the past 15 years (the program that I'm working on was converted to Firebird/dbExpress2 on 04/10/09) and adopted the FireDAC components. These work very well in a test harness.

So now, I will work on adding the dynamic SQL query3 that I have for this database. Once this is done, then I'll work through the entire program, changing what needs to be fixed.

After working for three hours on the new computer screen, it took me a few minutes to adjust to the smaller screen on the mobile computer, which is where I am writing this. On the other hand, the keyboard on the new computer is not very good; I suppose it's a function of getting used to it, but at the moment, I am inclined to invest in a better keyboard. All morning I kept on hitting the 'PrintScreen' key when I wanted to press F12 - that's very annoying.

Internal links
[1] 2064
[2] 201
[3] 573



This day in blog history:

Blog # | Date | Title | Tags
670 | 24/01/2014 | Song festival - videos | Kibbutz
921 | 24/01/2016 | Bitten by the bug | Programming, Priority tips
1373 | 24/01/2021 | Winter sun (new song) - and musings on Carole King and obscure chords | Song writing, Home recording
1710 | 24/01/2024 | The Ryan Universe | Tom Clancy

Friday, January 23, 2026

Properly converting a Hebrew database to Unicode

Since the great computer outage1, my ability to develop Delphi programs has dropped to a minimum. I've been able to work on the Prolog interpreter primarily because it uses no database and no Hebrew, so I can develop it with Delphi 7 using a virtual computer on my laptop. I used to say that I would convert the Occupational Therapist's programs to Unicode only if someone put a gun to my head - well, I've reached that stage and have to start converting her programs. I find it rather ironic to see that 'This day in blog history' has two blogs from five years ago about porting an application to Windows 10/Delphi 10.2! The applications may have been ported but the databases certainly weren't.

As working in Delphi 10.2 on my laptop computer (with 8 GB memory) is painful, I've taken the step of ordering a new desktop computer - this will have 16 GB memory, a much faster processor, a 1 TB disc and a larger screen. To think that my first ever computer had 512 KB memory and that adding a further 128 KB was considered a huge win! This new computer is due to be delivered later today.

In the mean time, I decided to convert one of the OP's programs that she's been having problems with recently. This program has "only" 15 tables, two of which are not being used any more (for licensing purposes) and a few have only integer fields. It's the tables with the character data - like the People table - that cause all the problems in converting data. Last night, I created the new database using 'SQL Manager Lite'; I defined the default character set for the database as UTF8, as necessary for Unicode. I defined all the tables using the DDL data, modifying this only when there were character fields.

Today I started work on the conversion tool. Some time ago, I worked on a similar tool for converting the Manager database - this has slightly over 100 tables, many of which contain character data; I wrote code that converted about half of the tables. The first table that I worked on this morning had only integer fields so that wasn't a problem, but the second had a character field, and this is where the problems began.

Simply reading from one database and saving to another is not good enough when the data has to be changed. I tried various methods, such as reading the old data by means of a conversion to AnsiString and then trying to convert it to UTF-8 via the AnsiToUtf8 function, but this didn't work either. I wasn't even sure that the field in the new database was defined as UTF8, because the 'column character set' was empty.

After lengthy consultation with CoPilot, I ended up using the following obscure code to read the old data and then save it as unicode.

for i := 8 to 15 do
begin
  raw := qGetPeople.Fields[i].AsBytes;              // raw WIN1255 bytes from the old field
  s := TEncoding.GetEncoding(1255).GetString(raw);  // decode to a Unicode string
  utf8 := TEncoding.UTF8.GetBytes(s);               // re-encode as UTF-8 bytes
  p := qInsPeople.Params[i];
  if Length(utf8) > 0 then
    p.SetBlobData(@utf8[0], Length(utf8))           // write bytes directly as a blob
  else
    p.Clear;
end;

CoPilot tells me that there's a bug in the database driver and there might be one in the SQL manager as well, so using a very low level method of writing the data as a blob should overcome these bugs. CP also gave me the code for checking the character set of the appropriate fields; I discovered that fortunately they are defined as UTF8. When I look at the character fields in the Unicode database, they are in Hebrew and not some form of gibberish.
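For what it's worth, the byte-level round trip that the Delphi loop performs can be demonstrated in a few lines of Python (the sample text is mine, not taken from the database):

```python
# The same three steps as the Delphi loop: raw WIN1255 bytes out of the
# old field, decoded to a Unicode string, re-encoded as the UTF-8 bytes
# that go into the new blob. The sample text is illustrative.

raw = "נעם".encode("cp1255")   # what AsBytes returns from the old field
s = raw.decode("cp1255")       # TEncoding.GetEncoding(1255).GetString
utf8 = s.encode("utf-8")       # TEncoding.UTF8.GetBytes
print(len(raw), len(utf8))     # 3 bytes in WIN1255 become 6 in UTF-8
```

The doubling in length is why a blind byte-for-byte copy can never work for Hebrew: every Hebrew character occupies one byte in WIN1255 but two in UTF-8.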

Today I'll finish converting the entire database (fortunately not too large) then tomorrow I'll start on converting the program itself. Who knows what problems await me.

Internal links
[1] 2041



This day in blog history:

Blog # | Date | Title | Tags
328 | 23/01/2011 | Sumptuous Sunday 4 | Cooking
541 | 23/01/2013 | The simplest ideas are normally the best | ERP, Problem solving
920 | 23/01/2016 | Vinyl log 27 - 23 January | Vinyl log, 10cc
1371 | 23/01/2021 | Porting an application to Windows 10/Delphi 10.2 (part 3) | Programming, Delphi, Unicode
1372 | 23/01/2021 | Porting an application to Windows 10/Delphi 10.2 (part 4) | Programming, Delphi, Unicode
1576 | 23/01/2023 | More David Crosby: Guinnevere | CSN
1892 | 23/01/2025 | Garth Hudson RIP | Obituary, The Band

Sunday, January 18, 2026

Have I completed the Prolog interpreter? Take 2.

Just over a month ago, I wrote1: "So now I've exhausted all the goals that I originally wanted from a Prolog interpreter. It can solve the following queries (or parse the statements), each of which exercises a different part of the syntax as well as including goals that include multiple clauses." Of course, about a day later, I was adding more functionality to the interpreter.

Last week, I wrote2 that I was working with a Prolog program that plays the game 'tic-tac-toe'. The game at first was irrelevant; the program contained a few constructs that my interpreter could not handle. So once again CoPilot and I started adding what was missing: most of this was simple (for example, the i/o functions write/1, nl/0 and read/1) but one was extremely difficult.

It turns out that the simple comma has two different functions in Prolog. Until now, the comma separated arguments to a predicate; for example, the comma in the clause 'parent (noam, netta)' separates the two arguments to the 'parent' fact. But in another context, the comma behaves differently: this is when there is a list of goals to be solved. For example, the rule 'grandparent (X,Y):- parent (X, Z), parent (Z, Y)' contains four commas; three of them separate arguments, but the comma between the two parent clauses is actually an operator, in the same way that '=' is an operator. If this comma were not an infix operator, the body of the rule would be written as ',(parent (X, Z), parent (Z, Y))'. It took a great deal of time to get the parser to parse an expression of this kind successfully. A casual user would be completely unaware of this problem.

Of course, after the parser succeeded in parsing the expression, the interpreter could no longer solve the rules; when solving, that compound statement has to be turned back into two separate clauses. This is similar to the manipulation performed on the goal list in order to implement the 'arg' predicate. This, at least, was relatively simple compared to the challenge of parsing the two conjoined clauses.
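Internally, the comma-as-operator means that a rule body is one compound term, ','(Goal1, Goal2), which the solver must flatten back into a list of goals before it can try them one by one. A toy Python illustration of that flattening (the interpreter itself is in Delphi; the tuple representation here is invented for the example):

```python
# Toy illustration of the comma-as-operator idea: a parsed rule body such
# as that of grandparent(X,Y) is a single compound term ','(G1, G2), which
# the solver flattens back into a list of goals. The representation of a
# term as a tagged tuple is invented for this sketch.

def flatten(term):
    # a conjunction is a tuple tagged ','; anything else is a single goal
    if isinstance(term, tuple) and term[0] == ",":
        return flatten(term[1]) + flatten(term[2])
    return [term]

body = (",", "parent(X, Z)", "parent(Z, Y)")
print(flatten(body))  # ['parent(X, Z)', 'parent(Z, Y)']
```

The recursion matters because conjunctions nest: a body with three goals parses as ','(G1, ','(G2, G3)), and flattening must unwrap both levels.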

Now the tic-tac-toe program could load and I could get the computer to play itself. But when I added the necessary parts to the program to enable me to play as well, all manner of problems arose, most of which were due to the program itself and not to my interpreter. In order to communicate a problem to CoPilot, I would take a screenshot of the offending part and upload it, but this seemed to me a poor way of communicating. Clocksin and Mellish mention that the DEC-10 version of Prolog (probably long out of use) would provide a session log, consisting of all the statements entered and displayed.

I thought this a good idea, and during a dog walk, I considered ways of implementing this. When I put it to CoPilot, it had a better idea: create modified versions of write/writeln and readln, and use these instead of the normal procedures. My private procedures write both to the screen and to the session log; very simple to implement, and very useful. Now I could upload complete sessions to CoPilot for analysis.

Basically, we discovered that the Prolog program that I was using was missing some clauses; not only that, it was quite stupid, in that it would choose the next available square in the grid: not a successful strategy. CP and I quickly implemented an evaluation function that assigns to each square a value, namely the number of winning lines in which that square can participate. On a 3x3 grid, each corner can participate in three lines whereas each edge square can participate in only two; the centre square is the most valuable, as it can participate in four. I also wanted the computer to announce its moves, instead of my having to look at the grid and figure out what had changed.
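The evaluation function is easy to verify: for each square, count the winning lines (rows, columns, diagonals) that pass through it. A quick Python check, using the 1-9 square numbering that appears in the session log below:

```python
# Count, for each square 1-9, the number of winning lines (rows, columns,
# diagonals) that pass through it - the evaluation function described above.

LINES = [
    (1, 2, 3), (4, 5, 6), (7, 8, 9),   # rows
    (1, 4, 7), (2, 5, 8), (3, 6, 9),   # columns
    (1, 5, 9), (3, 5, 7),              # diagonals
]

value = {sq: sum(sq in line for line in LINES) for sq in range(1, 10)}
print(value)
# centre (5) -> 4, corners (1, 3, 7, 9) -> 3, edges (2, 4, 6, 8) -> 2
```

Ordering the squares by this value gives exactly the precedence 5, then the corners, then the edges.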

Work finished late on Friday afternoon: it seemed that the program was tantalisingly close to completion, but I had run out of time. On Saturday morning, I ran the program and saw that it was 99% complete, so with a few additions we now had a Prolog version of tic-tac-toe that works quite well (the result is always a draw), as well as a much improved Prolog interpreter. Here's the session log for the final run:

> ?- consult (ttt).
ttt consulted
Yes.
> ?- playo.
You play X by entering integer positions followed by a period.
[1, 2, 3]
[4, 5, 6]
[7, 8, 9]
|: 1.
[x, b, b]
[b, b, b]
[b, b, b]
I play 2
[x, o, b]
[b, b, b]
[b, b, b]
|: 5.
[x, o, b]
[b, x, b]
[b, b, b]
I play 9
[x, o, b]
[b, x, b]
[b, b, o]
|: 3.
[x, o, x]
[b, x, b]
[b, b, o]
I play 7
[x, o, x]
[b, x, b]
[o, b, o]
|: 8.
[x, o, x]
[b, x, b]
[o, x, o]
I play 4
[x, o, x]
[o, x, b]
[o, x, o]
|: 6.
[x, o, x]
[o, x, x]
[o, x, o]
Cats game!
Yes.
> quit.

Incidentally, I didn't know what the expression "Cat's game" was supposed to mean. Could it be that someone whose nickname was Cat wrote the program and that this was her way of immortalising herself? No, says CoPilot. “Cat’s game” does mean something. It’s an old American expression for a tic‑tac‑toe draw. The idea is that when neither X nor O can win, the game is “for the cats” — meaning worthless, unwinnable, a stalemate. No cats were involved in the writing of your Prolog code.  I would have known this had I read the Wikipedia article more thoroughly.

I ran a new game: normally I would start with square 5 but, just to keep things interesting, I decided to start with square 1 instead. The computer choosing square 2 was not clever; this probably means that the simple rule that chooses the next available square appears before the rule that uses the evaluation function. As rules are evaluated in a top-down order, the position of a rule can be critical. After thinking about this, I rearranged the 'moves' rules so that they would appear according to their value; thus the first rule would cause the computer to choose square 5, then square 1, etc. When I ran the program with the rearranged rules, the computer played much better. Again, I deliberately chose to start with square 1 to see what the computer would do; it chose square 5. I then chose square 3, so the computer had to choose square 2, otherwise I would win with my next move. In doing so, the computer had set up the 2-5-8 combination, so of course I had to choose square 8. The computer then chose square 7 - this doesn't contribute anything, but it shows that the computer is playing according to the order of the moves: 5, 1, 3, 7, 9, 2, 4, 6, 8. As squares 5, 1 and 3 were occupied, the next unoccupied square according to the precedence was 7.

So once again I wonder, have I completed the Prolog interpreter?

Internal links
[1] 2047
[2] 2059



This day in blog history:

Blog # | Date | Title | Tags
230 | 18/01/2010 | Keith Tippett Group – "Dedicated to you, but you weren't listening" | King Crimson
538 | 18/01/2013 | Nic Potter RIP | Obituary, Van der Graaf Generator, RIP
918 | 18/01/2016 | MP3 headphones (2) / The scientific method | Programming, MP3
1370 | 18/01/2021 | Covid-19 vaccination (3) | Covid-19