Monday, October 26, 2009

Entity-relationship diagrams (ERDs)

These are diagrams which show database tables, the fields in each table and the relationships between tables. I've been looking on and off for a simple and free program which will display ERDs, and which will preferably extract the information needed automatically from a database.

Today I found such a program: IBUtils. Loading a database and displaying its structure is a bit awkward initially, but once the data is on screen, it's plain sailing.

Here is the ERD for the database which I partially described the other day:

I have yet to determine whether red lines are different from green lines.

Sunday, October 25, 2009

The importance of building indexes

Or as I would say, with the benefit of a classical education, the importance of building indices.

I have written a psychological testing application in which the user is presented with a list of words and has to choose ten words which very much describe them, then words which partially describe them, and words which do not describe them at all. The application itself works fine, but I was interested in exploring the meta-data possibilities: which words have been most frequently chosen in the first category, and which words have never been chosen in the first category. The first query was not a problem, but the second (which words have never been chosen) left me stumped.

The table structure is as follows:
table words: id, name
table choices: pid (person id), wid (word id), class (value between 1-6)

Presumably the answer involves a left join between words and choices, but there has to be a modifying statement - where choices.class = 1 - and this is causing me problems.
I posted this question on the excellent Stack Overflow site, and received some interesting answers. One person suggested the following query:

SELECT name
FROM words
WHERE id NOT IN
  (SELECT DISTINCT wid
   FROM choices
   WHERE class = 1)

When I ran this query, it took a staggering 36 seconds to produce the answer! I had been using a two-query strategy which took maybe one or two seconds, so receiving a pure SQL answer which took much longer seemed beside the point. When I responded, saying how long the query was taking, it was suggested that I add an index on the fields (wid, class) to the choices table.

Once I did this, the execution time dropped to 62 milliseconds. Yes, that's right: 580 times faster!! Later, someone suggested this query:

SELECT name
FROM words
WHERE NOT EXISTS
  (SELECT 1
   FROM choices
   WHERE choices.class = 1 AND choices.wid = words.id)

I don't know how long this would have taken without the index, but with the index it took only 31 milliseconds. When running the program, it seems that the data appears before it has been requested. This query is fascinating in its simplicity and I shall try to remember it. It's actually the same syntax which is used in a report at work to discover parts which don't have a bill of materials.
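Curious whether the two formulations really agree, here is a minimal sketch in SQLite with invented data (the table contents, word names and index name are all my own); it also shows that the left join I originally had in mind works once the class filter moves into the join condition:

```python
import sqlite3

# A tiny, invented model of the two tables (the real data is not shown here).
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE words   (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE choices (pid INTEGER, wid INTEGER, class INTEGER);
    INSERT INTO words VALUES (1, 'calm'), (2, 'bold'), (3, 'shy');
    INSERT INTO choices VALUES (10, 1, 1),  -- 'calm' chosen in class 1
                               (10, 2, 3);  -- 'bold' chosen, but not in class 1
    -- the composite index that made the difference (name is my invention)
    CREATE INDEX choices_wid_class ON choices (wid, class);
""")

not_in = con.execute("""
    SELECT name FROM words
    WHERE id NOT IN (SELECT DISTINCT wid FROM choices WHERE class = 1)""").fetchall()

not_exists = con.execute("""
    SELECT name FROM words
    WHERE NOT EXISTS (SELECT 1 FROM choices
                      WHERE choices.class = 1 AND choices.wid = words.id)""").fetchall()

# The left join also works, once the class filter moves into the
# join condition; filtering in WHERE would discard the NULL rows we want.
left_join = con.execute("""
    SELECT name FROM words
    LEFT JOIN choices ON choices.wid = words.id AND choices.class = 1
    WHERE choices.wid IS NULL""").fetchall()

assert not_in == not_exists == left_join
print(not_in)  # words never chosen in class 1: [('bold',), ('shy',)]
```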

So: it's very important to index fields which are used in a query. Armed with this knowledge, I checked a few other queries, both in this application and in our flagship app. After adding a few indices (all right, indexes), one query became three times faster and another twice as fast.

Even without the speed-up gained by the index, there is another point to be considered: the algorithm. In one part of the program (effectively the most important part, where the user's results are displayed, so speeding this part up is much more important than improving a non-essential query), I was calling the same query in a loop which ran six times (each loop invocation would call the query with a different parameter). It wasn't too difficult to see that I could remove the loop, call the query once for all the data, and then massage the returned data in code.
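The massaging step amounts to a single pass over the combined result set; a sketch with invented rows, assuming each returned row carries its class value:

```python
from collections import defaultdict

# Invented rows standing in for one query that fetches all six classes
# at once, instead of six parameterised calls of "... WHERE class = :n".
rows = [(17, 1), (23, 1), (9, 2), (41, 6), (5, 2)]   # (word id, class)

by_class = defaultdict(list)          # class -> word ids, built in one pass
for wid, cls in rows:
    by_class[cls].append(wid)

print(dict(by_class))   # {1: [17, 23], 2: [9, 5], 6: [41]}
```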

My attitude has always been to get the program running first with simple code and correct results; sometimes I forget to revise the program and substitute more complicated code which runs much faster.

Saturday, October 24, 2009

Wizz Jones: The legendary me - and musings on music samplers

I wrote some time ago about my musical adventures in 1971, a year which in retrospect was very important for the formation of my musical tastes. In that entry, there was a reference to local (to Bristol) folk artist Dave Evans; Dave recorded for a local record company which has since apparently become legendary, "The Village Thing". They had an office close to my school, and during lunch-hours I often used to go to that office.

The first release that I purchased from TVT was a four track mini-single called "The Great White Dap" (a dap is the primordial ancestor of what is now called a trainer, i.e. a soft shoe worn for sporting events). This featured one song each from the first VT releases: Ian A. Anderson, The Pigsty Hill Light Orchestra (somewhat similar to The Temperance Seven or The Bonzos), a Welsh duo called 'The Sun Also Rises' and guitarist Wizz Jones.

Jones' contribution was called "See how the time is flying", which I learnt to play in short order. This was always an intriguing song harmonically, as its verse starts in Dm and ends in Em. Instead of trying to make a harmonically pleasing transition between these keys, Jones simply starts the next verse with an abrupt change back to Dm.

Later on, TVT released a sampler lp with twelve tracks, including Jones' "Beggarman". Again, I enjoyed this song, but apparently not enough to buy its parent album, "The legendary me". Obviously my money at the time had more important claimants.

Fast forward maybe thirty-five years; money is no longer a problem and anyway my record buying days are virtually over. I began a search for those Village Thing records of my youth, and quickly found Dave Evans' initial record, "The words inbetween", which I wrote about in that blog entry. Lately I have found downloads of Hunt and Turner's "Magic Landscape" and Ian A. Anderson's "A vulture is not a bird you can trust". I have yet to find its predecessor, "Royal York Crescent", named after the street where Anderson lived at the time, and where he recorded some of Evans' songs. I used to visit him there.

- digression
I think that Jim Hunter, my English teacher at school, lived there as well, and he told me that another resident at the time was novelist Angela Carter, whose early book "Several perceptions" had become essential reading for me, and even the basis for an exercise in turning prose into poetry. I realised that the district in which Carter's book was set was indeed the arty Clifton district of Bristol, where the above crescent is situated. Hunter wrote a biography of poet Gerard Manley Hopkins, which led to the school graffito "Hunter puts the man into Gerard Manley Hopkins".
-end of digression

For several years, I have been on the trail of a cd release of Wizz Jones' "The legendary me", and eventually found it a few weeks ago. I've been listening to it on and off for the past week, and have reluctantly come to the conclusion that the two songs which I knew in 1971 were the highlights of the disk, and that had I heard some of the other songs, I wouldn't have bothered tracking down this disk. In certain aspects, Jones might have been a role model for my music - clean acoustic guitar picking, roughish vocals - had my harmonic vocabulary not been irrevocably twisted by Sandy Denny's "North Star Grassman" in the autumn of 1971.

In the early 1970s, the sampler was a legitimate way of allowing impecunious people the chance to hear a wide variety of artists signed to a label and so inform them of the quality of those artists. The Island samplers ("You can all join in", "Nice enough to eat", "Bumpers" and "El pea") immediately come to mind, but there were also successful samplers from CBS ("The rock machine turns you on"), Harvest, Vertigo and others. Versions of these samplers can now be found in the many cd compilations possibly saturating the market (for example, "Refugees: A Charisma Records Anthology 1969-1978").

I have often mused on the selection criteria for those samplers: did the compilers pick the best track on an album for inclusion in the sampler? Did they pick the most accessible song? Did they pick the most representative song? In terms of Wizz Jones, they (presumably the aforementioned Anderson) definitely picked the best songs, but when I consider "Nice enough to eat", which sufficed as many a young gentleman's introduction to 'underground' music (and to the Island label), then I'm not so sure.

Certainly, Fairport Convention's track on the sampler, "Cajun woman", was neither the best, most accessible nor most representative track from its parent album, "Unhalfbricking". In my opinion, it was the worst track on this ultimate record, not too accessible and certainly not representative (it was more an experiment in applying a Cajun arrangement to what would shortly be called a modern song in the trad idiom).

Another criterion seemed to be that compilers would pick the opening song from an album, presuming that each album's producer knew what he was doing when sequencing. Thus we have "Time has told me", which opened Nick Drake's "Five leaves left", followed by King Crimson's "21st century schizoid man", which similarly held the pole position on their "Court of the crimson king" debut album. On the other hand, neither Jethro Tull's nor Blodwyn Pig's contribution was the opening track.

Another problem which I have with listening to familiar songs from forty years ago along with unfamiliar songs is that the familiar will always sound better. I hope that I am free of this bias regarding "The legendary me", and can explain lucidly why the two songs which have become part of my musical fabric are better than anything else on this album. Incidentally, the cd release has three bonus tracks added at the end; as I didn't know the original, I can't complain that these songs stick out like a sore thumb, but even so.... The first is a recording of Leonard Cohen's "Sisters of mercy"; whilst adding nothing to the original, Jones inexplicably alters the song's chord sequence, making listening to it a jarring experience for someone versed in the original.

Thursday, October 22, 2009

Dollhouse

One of our satellite tv stations showed the first episode of Fox's new series, "Dollhouse". Intrigued by the promos which have been screened daily for the past few weeks, I decided to record the programme (as it's broadcast at an inconvenient hour for me).

Yesterday morning, I watched the first ten minutes or so, completely unable to fathom what was happening. In fact, I spent most of that time trying to figure out why the leading actress (Eliza Dushku) looked so familiar*, and in doing so, completely missed the fact that an actress that I had admired only the day before (Olivia Williams) was appearing.

During the day, I looked up the IMDB reference and began to understand what the show might be telling me. In the evening, I watched the programme from beginning to end. I needn't have bothered. After the very confusing first ten minutes (consisting of three or four scenes which appear to have absolutely nothing in common aside from the aforementioned Ms Dushku), the programme then turned into a high-action, low-meaning adventure story which did not interest me in the slightest. Once again, I am reminded why I don't like watching American tv series (aside from The West Wing and certain cartoon families).

On the basis of this opening salvo, and considering the programme's broadcast hours, I can't see myself watching it again.

* Yes, Ms Dushku was a cheerleader in "Bring it on". She does look vaguely like singer Natalie Imbruglia.

Tuesday, October 20, 2009

Lucky break

I saw the film "Lucky Break" the other night and thoroughly enjoyed it. I don't know why most of the IMDB reviewers dissed it. True, I doubt very much that prison life is as depicted in the film; I would be more afraid of my fellow prisoners and less of the prison guards (there was a moment of such fear when Jimmy Hands had his head covered in a plastic bin liner and was taken to see the new block 'king').

Any film with James Nesbitt, Bill Nighy, Timothy Spall and especially Olivia Williams is going to be a pleasurable experience. Nighy and Spall join forces again after their appearance in "Still crazy".

Side note: Nesbitt's sidekick, Rudy, is played by Lennie James. I was surprised to see that Lennie appeared in 'Cold feet' as Rachel's first husband (Kris "with a K"), whereas Nesbitt was Rachel's second husband (Adam). So, in one instance, actors Nesbitt and James were antagonists whereas in another instance, they were best mates.

A smaller but similar coincidence happened in the BBC series 'Survivors', about which I was going to write but decided not to; in my opinion, this series did not live up to its potential and was very scattered at the end. Anyway, the lead actress was Julie Graham, and one of her tribe was played by the ubiquitous Paterson Joseph, who seems to be cast in a huge number of television shows (he even appeared in one or two episodes of 'Cold feet'). Julie played the eponymous Mary in 'William and Mary', and Paterson played Reuben, the father of her children.

Thursday, October 15, 2009

Another day not wasted

Today I was considering how to improve the general look of one class of my programs. These have a dbgrid and several buttons, and whilst they are functional, they are somewhat old-fashioned. As a picture is worth a thousand words, here is a screen shot of the main form of one of the programs. (I should point out that these programs display personal data which, in the interests of privacy, I am not going to show here, so I'm showing the form as it appears at design time. Imagine that the grid is populated.)

The various components which appear over the grid are non-visual, and so disappear at run time.

The first improvement was to take all the buttons and place them on a gray panel:

This is certainly better, but the real problem is the grid, which is "so Win 3.1". What can be done? Dispose of the grid! I decided to display the data in a listview control, which is where the fun began. I need the listview to be displayed from right to left, which requires some magic code. Once I had done that, I discovered that the listview's header considers itself to be autonomous: it resolutely stayed left to right while the lines displayed right to left, making an ugly sight. More searching of the internet revealed how to get a handle to the header of the listview, which was then fed the same magic code. Success!

Of course, I couldn't leave things as they were; I had to figure out how to change the sort order of the data in the listview whenever a header was clicked. This turned out to be slightly easier than I thought. I also added code to allow editing (via a subform) of a row, and to display any change that might be made in the data. This required an extra query component to get the new data for the chosen record.

Here is the new screen (note that the header is still left to right):


And here is (some of) the code:

type
 TForm1 = class(TForm)
   lv: TListView;
   Panel1: TPanel;
   EditBtn: TBitBtn;
   BitBtn2: TBitBtn;
   SQLConnection1: TSQLConnection;
   SQLQuery1: TSQLQuery;
   qPerson: TSQLQuery;
   procedure FormCreate(Sender: TObject);
   procedure BitBtn2Click(Sender: TObject);
   procedure EditBtnClick(Sender: TObject);
   procedure FormActivate(Sender: TObject);
   procedure lvColumnClick(Sender: TObject; Column: TListColumn);
 private
   LastSortedColumn: integer;
 public
end;


var
 Form1: TForm1;

implementation
{$R *.dfm}
uses unit2;

Function SortByColumn (Item1, Item2: TListItem; Data: integer): integer; stdcall;
begin
 case Data of
   0: Result:= AnsiCompareText (Item1.Caption, Item2.Caption);
   // column n of the listview is held in SubItems[n - 1]
   1, 2: Result:= AnsiCompareText (Item1.SubItems[Data - 1], Item2.SubItems[Data - 1]);
   // dates sort newest first; equal dates now compare as equal
   3: if strtodate (Item1.SubItems[2]) < strtodate (Item2.SubItems[2])
       then Result:= 1
       else if strtodate (Item1.SubItems[2]) > strtodate (Item2.SubItems[2])
        then Result:= -1
        else Result:= 0;
   else Result:= 0;
 end;
end;

procedure TForm1.FormCreate(Sender: TObject);
const
 LVM_FIRST = $1000; // ListView messages
 LVM_GETHEADER = LVM_FIRST + 31;

var
 ListItem: TListItem;
 header: thandle;
 i: integer;

begin
 header:= SendMessage (lv.Handle, LVM_GETHEADER, 0, 0);
 SetWindowLong (header, GWL_EXSTYLE, GetWindowLong (header, GWL_EXSTYLE) or
                                WS_EX_LAYOUTRTL or WS_EX_NOINHERITLAYOUT);
 SetWindowLong (lv.Handle, GWL_EXSTYLE, GetWindowLong (lv.Handle, GWL_EXSTYLE) or
                                WS_EX_LAYOUTRTL or WS_EX_NOINHERITLAYOUT);
  lv.invalidate; // get the list view to display right to left
  lv.items.beginupdate; // reduce flicker

  with sqlquery1 do
   begin
    open;
    while not eof do
    begin
     ListItem:= lv.Items.Add;
     ListItem.Caption:= fieldbyname ('zehut').asstring;
     for i:= 1 to 4 do ListItem.SubItems.Add (fields[i].asstring);
     next
     end;
    close
   end;

 lv.Items.endupdate;
 qPerson.Prepared:= true;
 LastSortedColumn:= -1;
end;

procedure TForm1.lvColumnClick(Sender: TObject; Column: TListColumn);
begin
 LastSortedColumn:= Column.Index;
 TListView(Sender).CustomSort(@SortByColumn, Column.Index);
end;

procedure TForm1.FormActivate(Sender: TObject);
begin
 lv.setfocus
end;

procedure TForm1.BitBtn2Click(Sender: TObject);
begin
 close
end;

procedure TForm1.EditBtnClick(Sender: TObject);
var
 flag: boolean;
 id: longint;
 i: integer;

begin
 if lv.selected = nil then exit;
 id:= strtoint (lv.selected.SubItems[3]);
 with TGetDetails.Create (nil) do
   try
    flag:= execute (id);
   finally
    free
   end;

 if flag then
   begin
    with qPerson do
     begin
      params[0].asinteger:= id;
      open;
      lv.selected.Caption:= fieldbyname ('zehut').AsString;
      for i:= 1 to 3 do lv.selected.subitems[i - 1]:= fields[i].asstring;
      close
     end
   end;
 lv.setfocus
end;

end.

Sunday, October 11, 2009

This tickled my fancy

My dog sheds hair all the time, and someone suggested improving her food by adding a teaspoon of olive oil each day. I'll check the dietary values of her food this evening.

Whilst idly googling dogs and olive oil, I came across this pearl of wisdom:

Q: If you give olive oil to your dog, will it make my dog's nose any wetter?


To which the reply was "No, your dog's nose won't get wetter if I give my dog olive oil".

Someone has a problem with pronouns.

Friday, October 09, 2009

Firebird triggers and autoincrements

I often say that a day in which nothing new is learnt is a day wasted. Today will not be wasted.

One of the good things about the Borland Database Engine (BDE) is that it has an 'autoincrement' type; I would use a field of this type to provide primary keys for almost all database tables.

Firebird (FB) per se does not have an autoincrement type, which was a setback when I first started developing for this database manager. I use an oldish version of "EMS SQL Manager 2005 for InterBase & Firebird Lite" to create the databases, and this allows one to define fields as autoincrements. Under the hood, the program creates two entities connected to the autoincrement field: a generator and a trigger. The generator starts with a seed value; whenever it is called, it increments the seed and returns the new value, so every call to the generator is guaranteed to return a value which does not yet exist in the database. Since I have been porting applications to FB, I have been calling the generator manually to obtain the new key value.
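For the record, the manual call is just a SELECT against Firebird's dummy system table (the generator name here is assumed):

```sql
-- increments PEOPLE_ID_GEN by 1 and returns the new value
SELECT GEN_ID(PEOPLE_ID_GEN, 1) FROM RDB$DATABASE;
```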

Until now I have ignored the trigger which is also defined. Today I was defining a new database and decided to look at the trigger:
CREATE TRIGGER BI_PEOPLE_ID FOR PEOPLE
ACTIVE BEFORE INSERT
POSITION 0
AS
BEGIN
  IF (NEW.ID IS NULL) THEN
    NEW.ID = GEN_ID(PEOPLE_ID_GEN, 1);
END;

Turned into English: whenever a program is about to enter (insert) new data into the 'people' table, if the value of the 'id' field is null then the trigger will automatically call the generator and obtain a new value. This means that I don't have to call the generator manually, which in turn means that I can simplify my program code.

Almost all the programs I write for my occupational psychologist read in a data file which has been prepared by an 'exam' program run by an examinee; these data files' contents can be divided into two parts, where the first part is the examinee's personal data and the second part is the examinee's results.

The first part will go into the 'people' table, and I will need the id number of the inserted tuple in order to use it as a foreign key during the insertion of the 'results' tuple. Thus it's more efficient to call the generator manually for the 'people' table before inserting the data as I can save the value for later. But as the 'results' table has an autoincrement field as its primary key, I don't need to call the generator manually for this as I don't need the resulting value for anything other than as a primary key for the current tuple.

To give an explicit example, if the 'results' table has the structure
id: autoincrement (primary key)
pid: longint (foreign key, points to person whose results these are)
subject: longint
score: smallint

Then instead of writing queries like this
insert into results
values (:p1, :p2, :p3, :p4)

I can now write this query
insert into results (pid, subject, score)
values (:p2, :p3, :p4)

and have the pre-insert trigger automatically generate the value for the primary key. If the 'insert' statement does not have a list of named fields then it automatically inserts data starting with the first field in the table.
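Firebird gets this behaviour from the generator/trigger pair; purely as an illustration of the same pattern, SQLite offers it through INTEGER PRIMARY KEY, where an omitted key is filled in automatically:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE results (
        id      INTEGER PRIMARY KEY,  -- generated automatically when omitted
        pid     INTEGER,
        subject INTEGER,
        score   INTEGER)""")

# Named-field inserts: no id is supplied, so the engine fills it in.
con.execute("INSERT INTO results (pid, subject, score) VALUES (?, ?, ?)", (7, 1, 85))
con.execute("INSERT INTO results (pid, subject, score) VALUES (?, ?, ?)", (7, 2, 90))

print(con.execute("SELECT * FROM results").fetchall())
# [(1, 7, 1, 85), (2, 7, 2, 90)]
```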

Actually, I know that in the above situation, I could probably do away with the 'id' field, and have the primary key based on the fields 'pid' and 'subject', in which case using a trigger is beside the point.

Unfortunately, it seems that I will be able to use the trigger method (as opposed to calling the generator manually) infrequently, and then only for tables whose values are entered programmatically. Many tables are populated by data entry screens, where the database table is buffered by a client dataset, and in such cases it is impossible to get the trigger to fire, as a 'field requires value' message appears.

What is important about today's efforts is that I am thinking how to use the dbExpress/Firebird combination in the most efficient manner. This is all still relatively new material to me, whereas the BDE code has had over a decade to settle and improve.

Thursday, October 08, 2009

The girl who played with fire


In view of the favourable, but mixed, reaction that I had to Stieg Larsson's first book, "The girl with the dragon tattoo" (GDT), I decided to purchase and read its followup, "The girl who played with fire" (GPF). In retrospect, I wish that I hadn't bothered. The good parts of GDT far outweighed its flaws, but there's no way that I can say that about GPF. This book is in sore need of an editor, and unfortunately it's far too late for that.

Again, I am torn between holding back the spoilers for those who have not read the book, and giving specific examples of poor editing. When I write "editing", I don't mean copy editing, which is a technical function, but rather the equivalent of what a record producer does: gives the raw material a direction, chops out the irrelevant and sequences the raw material in the best order possible.

The first two hundred pages (approximately) are almost totally irrelevant to the rest of the book. The material about Salander would be better served by placing it in a collection of novellas and removing it from this book; it feels like a warm-up exercise and has no part in GPF. I am reminded of the late Roger Zelazny who prior to beginning a novel would write short stories about his main characters in order to get a feel for them. These pages give one the (post-reading) feeling that Larsson was making the whole thing up as he went along, and inserted events (or "hooks", as the musician or computer scientist might call them) as they occurred to him. If later events revolve around prior knowledge which is given by these hooks, then the reader feels satisfied, but if the hooks are left unresolved, then the feeling is awkward. I found myself reading the opening pages wondering where the events described would lead, and in the end, they led absolutely nowhere.

I find myself comparing the structure of (the rest of) the book to that of a police procedural. Such books generally begin with a murder, and then focus on the detective's attempts to solve the crime. Had GPF begun with the murders and then focused on the detective team's attempts, then it would have been on more solid ground (and perhaps more predictable). Larsson makes the book extremely complicated by having three (or even four) different teams trying to solve the murders and it makes for exceedingly complicated reading, whilst dissipating the thrust of the story. Where was Salander all the time her liberally enhanced sordid past was being splashed over all the newspapers in Sweden?

I sincerely hope that Swedish policemen are more professional than some of those portrayed in this book, as their attitude towards someone who is supposed to be a witness and possibly a suspect is remarkably tabloid (and some of the policemen's attitude towards one character who wasn't even a witness was deplorable).

After a while, the police procedural theme disappears and the book becomes more of a crime thriller, at which point any points which the book might have accrued slip away with remarkable speed. There are still unresolved hooks spread around (for example, those connected with Erika Berger) which dilute the story's impact.

In order to check the book's literary style, I opened the book a few times at random. The first thing that I noticed was that most of the pages were written as direct conversation between characters; the second was that the non-dialogue descriptions were sorely lacking in adjectives. This gives the prose a very straightforward and brisk attitude, but it doesn't make reading it a pleasure.

On the basis of GPF, I can't see myself reading the third part of the "Millennium trilogy", "The girl who kicked the hornets' nest". I would probably see any film made from these stories but I won't be investing any more money or time in these books.

Wednesday, October 07, 2009

Pressure cooker

Whilst reading the cookery column of Friday's newspaper the other day, I resolved to buy a pressure cooker. After a little research on the Internet, I went down to the local shopping mall and found myself a 9 litre cooker for 230 shekels (about $55). At first I thought there must be something wrong with the pot as the lid didn't seem to fit, but one of the sales staff showed me how to close the lid hermetically and thus keep all the steam inside.

I don't have a recipe book yet for this type of cooking, and I was warned not to experiment. I found a recipe for "Italian inspired chuck roast with herbed red wine gravy" which seemed like a good place to start. I marinated chunks of goulash beef overnight in a mixture of olive oil, wine and vegetable soup, then drained and dried the chunks. In the pressure cooker I heated oil and then browned the meat; after removing it, I fried a chopped onion, then added two cups of water, the meat, and chopped sweet potato and pumpkin. I then closed the lid of the pressure cooker as instructed and waited for steam to emerge; when it did, I closed a valve on the cooker's lid, reduced the heat and waited 30 minutes. After that time, I removed the pot from the heat, waited about five minutes, then opened the valve, allowing all the steam to escape. When the cooker showed signs of returning to normal, I removed the lid and decanted the contents into a bowl.

Somehow the amount of liquid had doubled during cooking, leaving me with a large amount of broth. The meat was tender and good. My wife had been very doubtful about my purchasing the cooker, but when she tasted the meal - and heard that the cooking time was only 30 minutes - she strongly endorsed the purchase. She has made goulash with this meat before, and has normally boiled the meat for anywhere between two and four hours.

Guess what we're having for dinner on Friday night....

Monday, October 05, 2009

Old computer in new case

I think that I haven't blogged about the problems that I've been having with my home computer. Looking back, I discovered that I bought the computer two years ago, and it's served me very well over that time.

About a month ago, the computer suddenly started turning itself off after about an hour, which made it very difficult to do any serious work with it. After checking, it seemed very likely that the fan in the power supply was no longer working (it had been making very suspicious noises previously and then was surprisingly quiet) and the computer was 'clever' enough to turn itself off before it overheated. I sent the computer to two different places in the hope of getting it repaired, but nothing was done, apart from cleaning out all the dog hairs which had accumulated inside.

After these 'repairs', the computer was still turning itself off after an hour. I spent one Saturday programming, with a very close eye on the clock: whenever an hour would go past, I would save everything, turn the computer off and then do something else whilst the computer cooled down. Obviously it wasn't possible to continue in this manner, and I transferred my development environment to my mobile computer. Whilst developing (or porting) on the mobile has definite advantages (like being able to sit on the balcony in the early morning and work amongst the bird song), I don't like working with the small keyboard, and the screen is poorly lit. I also don't have Office installed on the computer (my company tries to save on licenses), making debugging any automation code impossible.

I am on holiday all week, so I decided that this would be the time to settle the computer's fate. I opened up the case yesterday and realised that I can't do anything with the computer's innards, as the (very neatly engineered) contents were developed specially for this model. I took the computer to a nearby repair shop, and this time the repairman suggested that he transfer the motherboard, disks and assorted cards to a new case (which would have a more standard power supply).

This he did, for the princely sum of 350 shekels (about 90 dollars), and now I have a computer which seems to work properly. It's been on for five consecutive hours, I've done a virus scan, several compilations and downloaded email; the computer has performed perfectly, and is totally quiet to boot.

It seems that the power problems are passed (and past).