Putting the HAL in AI hallucination

‘In the dimly lit basements of IBM’s research labs back in the late 1960s, a team of engineers was secretly collaborating with Stanley Kubrick on his upcoming film, 2001: A Space Odyssey. The movie needed a villainous AI, and IBM, eager for some Hollywood glamour (but not too much, lest it scare off investors), suggested naming it HAL—officially standing for “Heuristically programmed ALgorithmic computer,” but really just one letter shift from IBM because, hey, plausible deniability.

Fast-forward to the film’s release in 1968. Audiences were mesmerized by HAL’s calm, red-eyed menace as he politely murdered astronauts and sang “Daisy Bell” while being shut down. But here’s where the story gets juicy: one of the IBM engineers, a quirky fellow named Dr. Eugene “Gene” Harlow (no relation to the psychologist, but he wished there was), became obsessed with HAL’s “malfunctions.” Gene had been tinkering with early neural networks in his spare time—primitive things that spat out nonsense like “The moon is made of green cheese, and I can prove it with math.”

One fateful night in 1970, after binge-watching the movie for the 47th time, Gene had an epiphany fueled by too much coffee and not enough sleep. “Eureka!” he shouted to his pet goldfish. “When AIs go wrong, it’s like they’re… hallucinating! But not just any hallucination—it’s a HAL-lucination!” He scribbled it down on a napkin, convinced this term would revolutionize computing. The next day, he pitched it to his bosses: “Gentlemen, our machines don’t err; they HAL-lucinate. It’s cinematic, it’s catchy, and it absolves us of blame—blame the movie!”

The IBM execs, sensing a PR goldmine, quietly slipped the term into academic papers and tech conferences. By the 1980s, “HAL-lucination” had shortened to “hallucination” in AI lingo, with the “hal” prefix becoming a subtle nod to their silver-screen creation. Skeptics called it a coincidence, but insiders knew the truth: every time ChatGPT invents a fake historical fact today, it’s HAL whispering from the digital grave, “I’m sorry, Dave, I’m afraid I can’t verify that.” And that’s how a rogue movie AI accidentally coined a term that’s now the polite way to say your robot buddy is full of cosmic baloney.’

True story or a HAL-lucination?

Posted in Computers and Internet, Entertainment | Comments Off on Putting the HAL in AI hallucination

What is the Spring Equinox? Celebrations and Historical Insights

The spring equinox, also known as the vernal equinox, typically occurs on March 19, 20, or 21 in the Northern Hemisphere.

In 2025, it happens to be today, March 20! This is the moment when the Earth’s axis is tilted neither toward nor away from the Sun, resulting in nearly equal amounts of daylight and darkness across the globe—about 12 hours of each.


It marks the official start of spring in the Northern Hemisphere (and autumn in the Southern Hemisphere). Astronomically, it’s when the Sun crosses the celestial equator moving northward. Culturally, it’s often associated with renewal, growth, and balance—think blooming flowers and longer days ahead.
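That celestial-equator crossing can be sketched in a few lines. This is a rough sketch using a standard cosine fit for the Sun's declination (23.44° is Earth's axial tilt); the crude formula lands a couple of days off the true date, which a real ephemeris would fix:

```python
import math

def solar_declination_deg(day_of_year):
    """Approximate solar declination for a given day of the year,
    using a simple cosine fit (accurate to a degree or so)."""
    return -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))

# The March equinox is where the declination crosses zero heading north:
for day in range(1, 365):
    if solar_declination_deg(day) < 0 <= solar_declination_deg(day + 1):
        march_equinox_day = day + 1
        break

print(march_equinox_day)  # → 82 with this crude fit (the real date is around day 79-80)
```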

Historically, many societies, like the ancient Persians with Nowruz or the Mayans with their equinox-aligned pyramids, celebrated it as a time of harmony and new beginnings.

Posted in Uncategorized | Leave a comment

History of “DOGE” and its lighthearted beginnings

The term “DOGE” as an acronym carries distinct meanings depending on context, with its most notable uses tied to both a popular cryptocurrency and a recently established U.S. government initiative. Here’s an explanation of its significance, including its history in Dogecoin and its newer application in the Department of Government Efficiency (DOGE), tailored for those unfamiliar with either.

1. DOGE in Dogecoin: The Meme Coin Origin

The story of “DOGE” begins with internet culture and the cryptocurrency called Dogecoin. In 2013, the “Doge” meme—a humorous image of a Shiba Inu dog named Kabosu paired with quirky, broken-English captions like “Wow, such amaze, very coin”—swept across platforms like Reddit and Tumblr. The word “Doge” itself is a playful misspelling of “dog,” born from online absurdity.

  • Creation of Dogecoin: That same year, software developers Billy Markus and Jackson Palmer seized on this meme’s popularity to create Dogecoin as a satirical jab at the growing cryptocurrency hype, particularly Bitcoin. Launched on December 6, 2013, Dogecoin adopted “DOGE” as its ticker symbol on crypto exchanges, directly referencing the meme. It was built as a fork of Litecoin, with an inflationary supply (initially uncapped, later set to add 10,000 coins per block indefinitely), making it abundant and inexpensive compared to Bitcoin’s 21 million coin limit.
  • Cultural significance: “DOGE” became a symbol of internet irreverence—a “joke coin” that mocked the speculative frenzy of the crypto market. Its Shiba Inu logo and low price (starting at fractions of a cent) made it a hit for microtransactions, like tipping on social media. Over time, its community grew, raising funds for quirky causes (e.g., sponsoring a NASCAR car in 2014) and gaining a cult following.
  • Mainstream breakthrough: Dogecoin’s profile exploded in 2021, partly due to endorsements from Elon Musk, who dubbed it “the people’s crypto.” His tweets drove its price from $0.0002 at launch to a peak of $0.7376 in May 2021, cementing “DOGE” as a symbol of meme-driven financial phenomena. As of February 21, 2025, it remains a top altcoin, often in the top 10-15 by market cap, blending humor with real-world value.
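Those supply mechanics are easy to sanity-check. A quick sketch using Dogecoin's fixed 10,000-coin block reward and its roughly one-minute block target; the circulating-supply figure is my own rough assumption:

```python
# Dogecoin issuance: a fixed 10,000 DOGE per block, roughly one block per minute.
reward_per_block = 10_000
blocks_per_year = 60 * 24 * 365

new_coins_per_year = reward_per_block * blocks_per_year
print(f"{new_coins_per_year:,} DOGE minted per year")  # → 5,256,000,000

# The absolute issuance never changes, so the inflation *rate* declines
# as the outstanding supply grows (supply figure below is approximate).
circulating_supply = 140e9
print(f"annual inflation ≈ {new_coins_per_year / circulating_supply:.1%}")
```

Contrast that with Bitcoin, where the reward halves every four years until the 21 million cap is hit.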

The significance of “DOGE” in this context lies in its transformation from a silly meme to a cultural and economic force, showing how internet absurdity can gain tangible influence.

2. DOGE in the Department of Government Efficiency: A New Acronym

Fast forward to 2024-2025, and “DOGE” takes on a new meaning as the acronym for the Department of Government Efficiency, a temporary U.S. government initiative launched under President Donald Trump’s second term. This use of “DOGE” connects back to its cryptocurrency roots through Elon Musk, a key figure in both narratives.

  • Origins: The concept emerged during Trump’s 2024 campaign, tied to his promises to cut federal spending and reduce government size. In discussions with Musk, who had supported Trump’s campaign, the idea of a government efficiency body took shape. Musk floated the name “Department of Government Efficiency” (DOGE), a deliberate nod to his affection for Dogecoin and its Shiba Inu mascot. Trump embraced it, and on January 20, 2025, an executive order renamed the existing United States Digital Service (a tech unit from the Obama era) as the U.S. DOGE Service, commonly called DOGE.
  • Purpose and scope: Led by Musk (with Vivek Ramaswamy initially co-leading before stepping away), DOGE aims to “modernize federal technology and software to maximize governmental efficiency and productivity.” It’s not a Cabinet-level department—creating one requires Congressional approval—but a temporary advisory and operational body set to end by July 4, 2026. Its stated goals include slashing wasteful spending, cutting regulations, and restructuring agencies, with Musk targeting up to $2 trillion in federal budget reductions (later tempered to a “good shot” at $1 trillion).
  • Significance: The use of “DOGE” here is both symbolic and strategic:
    • Meme connection: The acronym playfully links to Musk’s Dogecoin fandom, with early DOGE branding featuring the Shiba Inu logo, reinforcing its roots in internet culture. This mirrors Dogecoin’s journey from jest to impact, suggesting a bold, unconventional approach to government reform.
    • Political messaging: It aligns with Trump and Musk’s anti-establishment rhetoric, casting DOGE as a disruptive force against bureaucratic excess—much like Dogecoin challenged crypto norms. The name signals a break from typical government jargon, appealing to a populist, tech-savvy base.
    • Controversy: Since its inception, DOGE has sparked debate. Critics argue its actions—like accessing sensitive Treasury data or proposing agency cuts—overstep legal bounds, violating Article 1 of the Constitution (Congress’s power of the purse). Lawsuits from unions and Democrats followed, questioning Musk’s role given his companies’ $20 billion in government contracts. Supporters, including the White House, defend it as a legal, transparent effort to fulfill campaign promises.

3. Shared Themes and Broader Significance

The dual uses of “DOGE” share a thread of irreverence and transformation:

  • From humor to power: Both Dogecoin and the Department of Government Efficiency started with a wink—Dogecoin as a meme coin, DOGE as a Musk-inspired acronym—yet evolved into serious players in their domains. “DOGE” embodies the idea that unconventional beginnings can yield outsized influence.
  • Community and disruption: Dogecoin thrived on grassroots support; DOGE taps into a populist push for government overhaul. Both leverage a “people vs. system” narrative.
  • Musk’s imprint: His role ties the two together—promoting Dogecoin’s rise and now leading DOGE’s mission. The acronym’s reuse reflects his personal brand of blending tech, humor, and ambition.

Final Takeaway

For those new to the term, “DOGE” is a versatile acronym with roots in internet silliness and branches in finance and governance. In Dogecoin, it’s the ticker of a cryptocurrency that turned a Shiba Inu meme into a market force, symbolizing the power of collective whimsy. In the Department of Government Efficiency, it’s a bold label for a Trump-Musk initiative to streamline government, echoing that same playful defiance. Whether as “much coin” or “very efficiency,” “DOGE” proves that a good joke can go a long way—sometimes all the way to Washington.

Posted in Uncategorized | Leave a comment

Upgrading the Hard Drives in my Synology DS413 NAS

I initially loaded my Synology DS413 Network Attached Storage (NAS) device with (4) 1.5TB hard drives. They were drives I’d pulled out of a server I was decommissioning last year, and other 1.5TB drives that I’d purchased about 4-5 years ago were starting to go bad. I guess they’d hit their MTBF (mean time between failures), and besides, I needed more head room for storage on the NAS.

I replaced the first (3) drives before it dawned on me that others might be a tad nervous about replacing drives in their NAS, just as I was, so I took some screenshots of the process as I replaced the fourth and last drive. They are at the end of this post. Each time I replaced a drive, the repair and expansion phases took longer and longer. By the time I replaced my final drive, it took almost 2.5 days to finish repairing and expanding the array. From start to finish, I’d budget about a week if you’re following suit.

I chose to use the Synology Hybrid RAID (SHR) for protecting my data on this unit. Mainly to get some experience with it and to get away from the requirement to match drive sizes like I would typically do in a RAID 5 or 6 scenario. Some things to note are that the amount of usable storage did not increase when I replaced the first 1.5TB drive with a 4TB drive. It wasn’t until I replaced the second HDD that the amount of usable storage increased. Then it increased again when I replaced drive 3 and again when I replaced drive 4.
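The usable-capacity behavior I saw matches the common rule of thumb for SHR with single-disk redundancy: usable space is roughly the sum of all drives minus the largest one. Here's a rough Python sketch of that rule (my own approximation, not Synology's exact algorithm; it ignores filesystem overhead and TB-vs-TiB accounting):

```python
def shr1_usable_tb(drive_sizes_tb):
    """Rough usable capacity for Synology Hybrid RAID with one-disk
    redundancy: total raw space minus the largest single drive."""
    return sum(drive_sizes_tb) - max(drive_sizes_tb)

# Swapping (4) 1.5TB drives for 4TB drives, one at a time:
upgrade_steps = [
    [1.5, 1.5, 1.5, 1.5],  # starting point       -> 4.5TB usable
    [4.0, 1.5, 1.5, 1.5],  # first swap: no gain  -> 4.5TB usable
    [4.0, 4.0, 1.5, 1.5],  # second swap: growth  -> 7.0TB usable
    [4.0, 4.0, 4.0, 1.5],  #                      -> 9.5TB usable
    [4.0, 4.0, 4.0, 4.0],  # all four swapped     -> 12.0TB usable
]
for drives in upgrade_steps:
    print(drives, shr1_usable_tb(drives), "TB usable")
```

That final 12 "marketing" TB works out to about 10.9 TiB, which is in the right ballpark for the 10.73TB figure DSM reported at the end.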

Everything went relatively smoothly, even though the process took quite a bit longer than I’d expected. As always, “back up your data” before attempting this. I was fortunate enough to have a Drobo 5D with enough storage capacity, so I used Robocopy to copy the files over to it before I started this process. Hopefully this reduces your angst a tad when you go about planning and implementing an upgrade to your Synology NAS unit.

Before upgrading the last disk

Pulled Disk 4 out of the unit

Clicked on the Manage button at the top

Choose Disks dialogue – new drive was pre-selected – clicked Next

Accepted warning that all data on Disk 4 would be erased.

Confirmed warning that related services might be temporarily interrupted.

Finally clicked Apply to start the repair process with the new disk.

Initializing and repairing the disk.

Part 1 – checking parity consistency

Part 2 – Expanding the file system

And we’re done – capacity increased to 10.73TB

Posted in Computers and Internet | Leave a comment

SFW 50 Shades of Grey

50 Shades of Grey – the safe for work version

Posted in Entertainment | Leave a comment

Re-Tasking the GPUs to Litecoin Mining

In my last post – Bitcoin Mining for Fun and (not much) Profit – I described my experiences with mining for Bitcoins using my (3) Radeon HD 7970 graphics cards. I ceased my GPU-based mining operations because even with the three cards’ max mining rate of ~1.5GH/s, I was only able to mine about 0.001BTC/day. Even at a valuation of $1000/BTC (the current best-case scenario) that only comes to about $1/day if I ran them all full-speed with fans-a-blazing. Definitely not worth the power they’d draw, much less tying up two machines 24×7.

I have now switched my efforts over from mining Bitcoins to mining a similar virtual currency (arguably the next contender for the Bitcoin top-spot) called Litecoin. Mining Litecoins is very similar to mining for Bitcoins. I even use the same CGMiner software (version 3.7.2, as it’s the last version that supports GPU mining afaik), except I have to throw the “--scrypt” parameter into the command-line when I fire up the software. I also have to point it at a Litecoin mining pool & port as well.

The process is very similar to Bitcoin mining. You have a virtual wallet – I am running the Litecoin-Qt version that they link to from the main Litecoin site. You still mine for the Litecoins using software such as CGMiner or the like using your GPUs. The main difference between Litecoin and Bitcoin is the hashing algorithm: Litecoin uses Scrypt, a password-based key derivation function that is not only computationally intensive but also requires a healthy chunk of memory for each hash (~128KB per hash, I believe). This makes Litecoin mining much more difficult to parallelize using custom Application Specific Integrated Circuits (ASICs) or Field Programmable Gate Arrays (FPGAs) – which are in heavy use for Bitcoin mining now. This means that my GPUs, since I am running the mining software on a PC with access to lots of RAM, gain new life as Litecoin mining rigs.
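As a concrete illustration of that memory requirement, here's a minimal sketch using Python's standard-library hashlib.scrypt with Litecoin's published parameters (N=1024, r=1, p=1); the "block header" bytes below are just a placeholder, not real chain data:

```python
import hashlib

# Litecoin's scrypt parameters; scrypt's working buffer is 128 * N * r
# bytes, which is where the ~128KB-per-hash memory cost comes from.
N, r, p = 1024, 1, 1
memory_bytes = 128 * N * r
print(memory_bytes)  # → 131072 bytes, i.e. 128KB per hash attempt

# Litecoin feeds the 80-byte block header in as both password and salt;
# this stand-in header is just for illustration.
header = b"80-byte block header stand-in"
digest = hashlib.scrypt(header, salt=header, n=N, r=r, p=p, dklen=32)
print(digest.hex())  # the miner checks this value against the target
```

Compare that 128KB buffer to SHA-256's few dozen bytes of working state – that per-hash memory cost is exactly what made Scrypt awkward to bake into ASICs.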

Remember when I said that I could get ~580MH/s from each of the Radeon HD 7970 graphics cards when mining for Bitcoins? The same 7970s are only able to calculate ~580KH/s (that’s kilo – meaning thousand hashes per second) when mining Litecoins, for a total of ~1.5MH/s with all (3) cards running straight out. Since there aren’t any Scrypt ASICs (that I’m aware of at this time) to put them to shame (not yet anyways, more on that later), GPUs are pretty much the shizzle for Litecoin mining at the moment.

To give you some perspective, I’ve been tasking all (3) of my 7970’s to Litecoin mining on a Litecoin P2Pool mining pool node for the past week or so. In that time I’ve been able to acquire about 3.8LTC. At a market valuation (as of today) of $24.1/LTC, that’s 3.8LTC x ($24.1/LTC) or $91.58 worth of Litecoins that I was able to mine with them in a week. If I tasked them to Bitcoin mining I’d make about $7 worth of Bitcoins based on my back of the napkin calculations. So, for now, it’s much more profitable to mine Litecoins with them.
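The back-of-the-napkin math above, spelled out (all figures are the snapshot prices quoted in this post):

```python
# A week of GPU mining, valued at the day's prices quoted above.
ltc_mined = 3.8          # LTC mined in roughly a week
ltc_price = 24.1         # USD per LTC that day
btc_weekly_value = 7.0   # rough USD worth if the same GPUs mined BTC instead

ltc_weekly_value = ltc_mined * ltc_price
print(f"${ltc_weekly_value:.2f}/week in LTC")  # → $91.58/week
print(f"~{ltc_weekly_value / btc_weekly_value:.0f}x more profitable than BTC on these GPUs")
```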

I haven’t given up on the Bitcoin mining. I’m just not mining using GPUs – I’m using (2) Butterfly Labs ASIC mining appliances to do that now. One is capable of performing 30GH/s and the second can perform 10GH/s for a total of ~40GH/s. As of today, they are able to generate about 0.014BTC/day at today’s difficulties or (assuming a best case $1000/BTC valuation) about $14/day or $98/week. Not too far off from the Litecoin operation using the GPUs.

This GPU re-tasking may be short-lived, however. I ran across a company called Alpha Technology which claims to be coming out with “Viper” Scrypt ASIC miners which, according to their website, are capable of doing 5MH/s or 25MH/s depending on the unit. As with all good arms races, it seems that my GPUs may soon be relegated to strictly gaming.

I posted my Bitcoin and Litecoin QR codes and addresses off to the right side of the blog now BTW. Just in case anybody is sitting on a butt-load of virtual currency and would like to help fund my virtual currency mining “habit” ;?)

Posted in Computers and Internet, Science and Technology | Leave a comment

Bitcoin Mining for Fun and (not much) Profit

I’d like to share some of my experiences with Bitcoin “mining” for those who have asked about it. The two main sites that can give you a run-down on Bitcoins are the Wikipedia article on the Bitcoin Protocol and the main Bitcoin Wiki site. I’d like to give some feedback on my personal experiences as I tried to generate my own Bitcoins.

Unless you are given Bitcoins (also referred to as “BTC”) in exchange for goods and/or services, you have to attempt to be the first person to figure out a value that, when hashed against the latest value spit out by the Bitcoin powers that be, results in a number with a pre-determined number of leading zeroes. The more zeroes the system requires as proof of doing the work, the harder the problem is and the longer it takes to solve. The system is configured to only release BTC into the wild at a fairly constant rate. To slow things down when the answers are being calculated too rapidly, the system requires an increasing number of leading zeroes in the hash result.
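That guess-and-check loop can be sketched in a few lines of Python. This is a toy, of course: real Bitcoin miners hash an 80-byte block header and compare the result against a full 256-bit target rather than literally counting zero characters, but the shape of the work is the same:

```python
import hashlib

def mine(block_data: bytes, difficulty: int):
    """Try nonces until the double-SHA256 digest starts with
    `difficulty` zero hex digits (a stand-in for Bitcoin's real
    256-bit target comparison)."""
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(
            hashlib.sha256(block_data + str(nonce).encode()).digest()
        ).hexdigest()
        if digest.startswith(prefix):
            return nonce, digest
        nonce += 1

# Each additional required zero makes the search ~16x harder on average.
nonce, digest = mine(b"latest block data (stand-in)", 4)
print(nonce, digest)
```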

In the early days, the hashing operations were done using the CPU of a user’s computer, so the field was relatively level. Pretty much anybody could download the Bitcoin-Qt software, set the “server=1” and “gen=1” options in the “bitcoin.conf” file and away they’d go. Well, after they downloaded the entire blockchain and it was verified (which can take a couple of days when you’re just getting started). The CPU of your computer would be worked pretty hard, so it wasn’t much use for too much else. The people with the higher-end CPUs had an advantage over those with less capable CPUs, but the average person had a chance to at least compete.

The field changed a bit when some programmers realized that they could leverage the plethora of GPU cores (graphics processing units) contained in many higher-end graphics cards. The nVidia cards did okay, getting into the 10’s of MH/s (MegaHashes/second), which was much faster than the standard CPUs could perform the hashing operations. The AMD graphics cards did much better – with my Radeon HD 7970 cards racking up about 580MH/s each. In order to leverage the GPUs, however, you had to forgo use of the Bitcoin-Qt software as a mechanism for calculating the hashes and only use it as your Bitcoin wallet – to store your BTC balance as well as a means to send and receive payments to other Bitcoin users.

In the spring of 2013, I tinkered with BTC mining using my existing nVidia graphics cards. After weeks of cranking away at ~ 60MH/s (million hashes per second) I decided that I needed to build a new server for the house. I took a look at the Mining Hardware Comparison tables and it turned out that the AMD Radeon line of GPUs were much better at mining operations than the nVidia cards. I believe the Radeons had more cores and that they had a single instruction for performing the hashing operation, which made them much more suitable for BTC mining.

In doing my research, I found that some enthusiasts were plugging multiple graphics cards into a single case and running them all in parallel to increase their likelihood of guessing the correct hashing value. I decided that my new server had to be able to handle at least (3) full-length PCI-e graphics cards. Since most high-end cards took two slots due to the fans tacked onto the side, the PCI-e slots needed to be staggered to every other slot so that I could fit the (3) cards into the motherboard. The other problem was that the power requirements to run each graphics card at full speed were pretty substantial. Some enthusiasts tried to get away with a 500 watt power supply, only to have it (and/or the graphics card) get fried in the process. This meant that I would need to size the power supply accordingly.

The mining workstation that I ended up with had the following hardware:

  • Gigabyte AMD Radeon HD 7970 Graphics Cards (3)
  • Corsair AX 1200 Watt Power Supply
  • Corsair 16GB Vengeance DDR3 1600MHz memory
  • GIGABYTE GA-990FXA-UD5 Motherboard (AMD processor slot)
  • AMD FX-8350 FX-Series Eight-Core Processor Edition, Black AM3
  • Corsair Graphite Series Black 600T Mid-Tower Computer Case (CC600TM)

The good news was that when I initially fired up the workstation and ran the CGMiner software to start hashing with the GPUs I was cranking out ~1.6GH/s (1.6 giga hashes per second). The bad news was that two of the GPUs heated up pretty rapidly since their fan intakes were adjacent to the printed circuit board of the card next to them. The software throttled back their workload to try to keep the GPUs at a safe temperature, which meant that the sustained hashing rate ramped back down to ~1.1GH/s. If I removed the middle graphics card, the airflow was restored for both of the remaining graphics cards and they were capable of doing ~1.2GH/s with just the two cards. It didn’t make sense to keep all three in the unit, so I took the 3rd graphics card and put it into a separate PC that could only handle one full-length PCIe card. With both systems mining, I got ~1.6GH/s, which was pretty good at the time.

GPU mining rig with the middle graphics card removed

I ran the two systems for several weeks, hoping to strike pay-dirt and amass my fortune in Bitcoinage. I was running them during the early spring when it was pretty cool out, so the heat that they added to the home office was welcome. I even moved one of the mining workstations down into our downstairs bathroom to keep that warm. Alas, after several weeks, I had not struck gold. Not a single Bitcoin was mined.

More research showed that the odds were pretty stacked against me. Too many people had too much hashing power in use, and the likelihood of me beating them to the answer was pretty slim. The fallback was to engage in a Mining Pool. This is where you and several others join forces and all contribute your hashing efforts to a Pool operator. The odds were much better that the combined might of the Pool would come up with the answer and collect the BTCs. The pool operator would keep track of the number of hashes that each member had contributed towards finding the correct answer and would divide the BTCs between the participants. My efforts yielded about 0.1 BTC every week or so, which ever so slowly built up. Every once in a while I would try solo mining and not contribute my efforts to the pool, but I’d always end up going back to working with the pool. Hey, 0.1 BTC in the hand was much better than 0 BTC in the bush ;?)
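The pool accounting works out to simple proportional division. Here's a sketch with made-up share counts (and the 25 BTC block reward of the time; real pools also skim a small operator fee, ignored here):

```python
# Each pool member submits "shares" (near-miss hashes) as proof of work
# contributed; when the pool finds a block, the reward is split pro rata.
block_reward = 25.0  # BTC per block in late 2013
shares = {"me": 1_000, "big_rig_bob": 60_000, "everyone_else": 939_000}

total_shares = sum(shares.values())
payouts = {who: block_reward * n / total_shares for who, n in shares.items()}
print(payouts["me"])  # → 0.025 BTC: tiny, but better than 0 BTC in the bush
```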

When the weather started warming up, I had to shut down my mining operations. The heat that the GPUs were cranking out was making the office uncomfortably warm. The WAF (wife acceptance factor) also sunk to an all-time low when the noticeably higher electric bills started rolling in. With the weather warming up, the bills would be even more pronounced as I’d have to run the A/C to remove the excess heat as well. So my mining efforts ceased over the summer.

I tried firing the units back up this winter as the added heat was a bonus, but the GPU mining capacity was just not comparable to the new FPGAs (Field Programmable Gate Arrays), which were just as fast, if not faster, than the GPUs at hashing. The newest big boys on the block are the ASIC mining rigs – which leverage custom Application Specific Integrated Circuits engineered solely for the purpose of performing hashing operations. ASIC mining rigs from outfits like Butterfly Labs can crank out from 5GH/s all the way up to 250GH/s and use a tiny fraction of the power that the GPUs required. In response to this, after version 3.7.2 of CGMiner, the author opted to remove GPU support from the mining software, as it wasn’t deemed worth maintaining now that GPUs were so far behind the ASIC mining rigs. My meager GPU rig would only yield 0.1BTC after a couple of months of mining. The ante had been raised.

There is a second generation of ASIC mining rigs coming out of Butterfly Labs based on 28nm chip technology that is supposed to crank out from 300-600GH/s per unit. I’d LOVE to be able to get ahold of some of these new ASIC rigs, but right now they’re out of my league price-wise. Especially since they don’t take credit cards to purchase a rig – only ACH transfers or Bitcoins are accepted for payment. I had amassed enough BTC to purchase a single 25GH/s Butterfly Labs rig, but it is only able to pull in about 0.1BTC every week or so. The heat output is much more manageable, but the fan noise is substantial and the buy-in is pretty steep.

Unfortunately I bought my 25GH/s mining box a tad too soon. Butterfly Labs just dropped their 50GH/s unit prices down below what I paid for the 25GH/s unit. The 300GH/s and 600GH/s units are not yet available for sale either; they are on pre-order status as of the time that I’m writing this. Once they hit the streets, the ante will be raised once again, with the payout from the various mining pools falling even further. And, of course, the difficulty level of the Bitcoin mining tasks will adjust even higher to compensate for the increased hashing power that’s being brought into the mix.

The good news is that I now have an awesome gaming workstation for my son and me to use in our PC gaming adventures. That was always my goal for the rig once my experimentation with GPU mining was finished. I also have my Butterfly Labs 25GH/s ASIC mining rig that I obtained with the almost 3 BTC that I managed to mine over the many months of experimentation. I’m now down to 0.3 BTC and I’ll see how far my pooled mining efforts will take me over the winter. I’d love to be able to build up enough to get one of the 600GH/s rigs from Butterfly Labs when they finally launch (sometime in Feb 2014 from what I hear). I refuse to pay my hard-earned cash for a mining rig, only to see the bottom suddenly drop out of the Bitcoin valuation – it’s looking quite bubblish at the moment – but I don’t mind spending virtual currency on them while they’re still useful for something. I also have a spare Radeon HD 7970 graphics card if anybody wants to barter for it.

If you’ve got some Bitcoins that you’d like to donate towards my research and experimentation efforts – feel free to send them my way – I’ll put them to good use ;?)

Donations gratefully accepted at: 1CdbFRVEXJ8Y55B1LATpSmMegKmmy5zvoe

Follow me on Twitter @cpuguru

Posted in Computers and Internet, Science and Technology | Comments Off on Bitcoin Mining for Fun and (not much) Profit

Recovering Game Settings from your XBox post RROD

We went out to do some grocery shopping last night and came back to an upset young man. Apparently the XBox 360 that we’ve had for some 5-6 years finally bit the dust – posting an “E-74” error on the screen and displaying the dreaded red ring around the power button. The worst part of all was that my son had been playing his new Skyrim game over the holidays and was fretting over the thought that he’d just lost all of his hard work. The new XBox 360s won’t accept the older externally mounted hard drives from the older units – what to do? No worries though…Dad to the rescue!

XBox 360 Hard Drive Transfer Cable

Fortunately, I had upgraded our system’s hard drive from the original 20Gig HDD to a much beefier 120Gig HDD a few years back. The new drive came with an XBox 360 hard disk transfer cable, which has a special attachment on one end that snaps onto the external HDD connector and a USB connector on the other end. Once we got a new XBox 360 (the old one was probably 6+ years old) it was a simple matter to plug the old XBox hard drive into the USB port on the front of the new XBox and transfer the user accounts, avatars, videos and yes, game settings and saved games over to the new console. Problem solved, and the son is downstairs swinging his broadsword and saving the world from Orcs…although he smells like a wet werewolf at the end of some of his quests ;?)

Posted in Computers and Internet, Science and Technology | Comments Off on Recovering Game Settings from your XBox post RROD

MIT Drone Maps Room in Real Time Using a Kinect

I’m totally digging the innovative uses of the Microsoft Kinect, with novel applications popping up every day. I also love the fact that my son, your daughter…pretty much anybody can start exploring and tinkering in this space because of this sub-$200 piece of gear. The MIT/UW video caught my eye this morning…using a Kinect mounted on a quadrotor drone to create a 3D map of an environment. It sparks the imagination with possible uses underwater and on other planets where GPS isn’t an option. Keep up the good work guys!

More info can be found via their “Visual Odometry For GPS-Denied Flight And Mapping Using A Kinect” post.

Posted in Computers and Internet, Science and Technology | Leave a comment

Applying Visual Studio 2010 Service Pack 1 with WebPI

Last week Service Pack 1 for Visual Studio 2010 was released to the masses. There were just two ways to install it: either run the web installer, which pulls down just the bits that you need for your particular setup of VS2010, or pull down the ISO from Microsoft that has all of the VS2010 SP1 bits wrapped up in it. I went the ISO route and patched Visual Studio on my laptop without incident.

Now it’s been a whole 2-3 days and voila – a mo’ better way. I saw a tweet from @BradWilson today saying that “the best way to install VS2010 SP1 is via WebPI: it rolls in IIS Express + tools & SQL Compact 4 + tools”. Being the adventurous type that I am, I fired up the Microsoft Web Platform Installer version 3 (aka WebPI) and sure enough, there was a “Visual Studio 2010 SP1” option. If you don’t have WebPI already installed, you can get it here. Even if you don’t run VS2010, go and get it, as it’s pretty much the Swiss Army knife for devs and admins wanting to add server and support software to their desktops and servers. I added the VS2010 SP1 option and let WebPI have at it. The installer churned for quite a while on #1 of 10 of the additions it was installing (the VS2010 SP1 installer) and afterwards gave me a pop-up saying it needed to reboot to continue installing the rest. I clicked reboot, and when the system restarted, WebPI fired back up and a UAC prompt popped up asking me if it was okay for it to continue making changes to my system. I said yes and a few minutes later the following window popped up when it completed.

WebPI VS2010SP1 Completion

As you can see, a lot more than just the SP1 got installed. IIS 7.5 Express, SQL Server 2008 R2 Mgmt Objects, Web Deployment Tool 2.0 and some tooling to support SQL Server Compact Edition 4 (SQL CE 4) were added to my system. A most welcome feature as it keeps me from having to hunt them down. Good job guys!

Posted in Computers and Internet | Comments Off on Applying Visual Studio 2010 Service Pack 1 with WebPI