After two massive hurricanes struck Puerto Rico three months ago, the island was left with extensive damage to its electrical infrastructure. Part of the problem was that the infrastructure was woefully inadequate to withstand a hurricane in the first place. It is possible to harden buildings and infrastructure against extreme weather, and a new plan to restore Puerto Rico’s power grid will make many of the changes that, frankly, should have been made long ago.
Among the upgrades to the power distribution system are improvements to SCADA systems. SCADA allows for remote monitoring and control of substations, switchgear, and other equipment, which minimizes the need to send crews out to investigate problems and improves reliability. SCADA can also be used for automation on a large scale, alongside other autonomous equipment meant to isolate faults and restore power quickly. The grid will get physical upgrades as well: equipment like poles, wire, and substations will be designed and installed to a more rigorous standard to make them more wind- and flood-tolerant. More infrastructure will be placed underground, and a more aggressive tree-trimming program will be put in place.
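To get a feel for the kind of automation this enables, here’s a toy sketch (the feeder model and function are invented for illustration, not taken from the actual plan or any real SCADA product) of how autonomous switchgear on a radial feeder might isolate a faulted section while keeping the healthy sections energized:

```python
# Toy model of automated fault isolation on a radial feeder.
# Sections are numbered from the substation outward; switch i sits
# between section i-1 and section i (switch 0 is the substation breaker).

def isolate_fault(num_sections, faulted):
    """Return (open_switches, energized_sections) after isolation.

    On a purely radial feeder, sections downstream of the fault stay
    dark unless a tie switch provides a backfeed (not modeled here).
    """
    # Open the switches immediately upstream and downstream of the fault.
    open_switches = {faulted, faulted + 1}
    # Everything upstream of the fault stays energized from the substation.
    energized = set(range(faulted))
    return open_switches, energized

# A fault in section 2 of a 5-section feeder: sections 0 and 1 keep power.
switches, live = isolate_fault(5, 2)
print(sorted(switches), sorted(live))  # [2, 3] [0, 1]
```

The point of doing this autonomously is speed: opening two switches in milliseconds instead of waiting hours for a line crew is exactly the reliability gain the plan is after.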
The plan also calls for some 21st-century improvements, including the implementation of “micro grids”. These micro grids reduce the power system’s reliance on centralized power plants by placing small generation facilities (generators, rooftop solar, etc.) in critical areas, like at hospitals. Micro grids can also improve reliability in remote areas, where traditional service is often impractical or uneconomical.
While hurricanes are inevitable in certain parts of the world, the damage that they cause is often exacerbated by poor design and bad planning. Especially in the mysterious world of power generation and distribution, a robust infrastructure is extremely important for the health, safety, and well-being of the people who rely on it. Hopefully these steps will improve Puerto Rico’s situation, especially since this won’t be the last time a major storm impacts the island.
There’s more than that of course, but the wind farms that [Jason Staggs] and his fellow researchers at the University of Tulsa had permission to access were — alarmingly — devoid of security measures beyond a padlock or tumbler lock on the turbines’ server closets. Since wind farms are generally in open fields away from watchful eyes, there is little indeed to deter a would-be attacker.
[Staggs] notes that a savvy intruder has the potential to shut down or cause considerable — and expensive — damage to entire farms without alerting their operators, usually needing access to only one turbine to do so. Once they’d entered the turbine’s innards, the team made good on their penetration test by plugging their Pi into the turbine’s programmable automation controller and circumventing the modest network security.
The team are presenting their findings from the five farms they accessed at the Black Hat security conference — manufacturers, company names, locations, etc. withheld for obvious reasons. One hopes that security measures will be stepped up in the near future if wind power is to become an integral part of the power grid.
All this talk of hacking and wind reminds us of our favourite wind-powered wanderer: the Strandbeest!
Marketing and advertising groups often capitalize on technological trends faster than engineers and users can settle into the technology itself. Perhaps that’s no surprise: the urge to get a product to market and profit is difficult to hold back. Right now the most glaring example is the practice of carelessly putting WiFi in appliances and toys and connecting them to the Internet of Things, but a similar fiasco is playing out in the electric power industry as well. Known as the “smart grid”, an effort is underway to modernize the electric power grid in much the same way that the Internet of Things seeks to modernize household appliances, but to much greater and more immediate benefit.
And if there’s anything in need of modernization, it’s the electric grid. It still extensively uses technology pioneered in the 1800s, like synchronous generators and transformers (not to mention metering and billing techniques perfected before the invention of the transistor), so there is plenty of opportunity to add oversight and connectivity to almost every part of the grid, from the power plant to the customer. Additionally, most modern grids are aging rapidly at the same time that we are asking them to carry more and more electricity. Modernization can also help this aging infrastructure deliver energy more efficiently.
While the term “smart grid” is as nebulous and as ill-defined as “Internet of Things” (even the US Government’s definition is muddied and vague), the smart grid actually has a unifying purpose behind it and, so far, has been an extremely useful way to bring needed improvements to the power grid despite the lack of a cohesive definition. While there’s no single thing that suddenly transforms a grid into a smart grid, there are a lot of things going on at once that each improve the grid’s performance and status reporting ability.
If you lived through the Y2K fiasco, you might remember a lot of hype with almost zero real-world ramifications in the end. As the calendar flipped from 1999 to 2000, many predicted disastrous software bugs in the machines controlling our banking and infrastructure. While that potential disaster didn’t live up to expectations, another major infrastructure problem reared its head shortly after the new millennium began, resulting in widespread blackouts in North America. While it may have seemed like Y2K was finally coming to fruition given the amount of chaos caused, the actual cause of these blackouts was simply institutional problems with the power grid itself.
[Larry] has done this sort of thing before with Amazon’s EC2, but recently Microsoft has been offering beta access to some of NVIDIA’s Tesla M60 graphics cards. As long as you have a fairly beefy connection that can support 30 Mbps of streaming data, you can play just about any game imaginable at 60fps on maximum settings.
It takes a bit of configuration magic and quite a few different utilities to get it all going, but in the end [Larry] is able to play Overwatch on max settings at a smooth 60fps for $1.56 an hour. Considering that buying the graphics card outright would set you back the equivalent of 2500 hours of play time, this is a great deal for the casual gamer.
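The break-even arithmetic is easy to check. Note that the card’s price isn’t quoted in the article; the figure below is simply what the $1.56/hour rate and 2500-hour break-even imply:

```python
# Back out the implied card price from the article's own figures and
# sanity-check the break-even point in rental hours.
hourly_rate = 1.56        # USD per hour of cloud GPU time (from the article)
breakeven_hours = 2500    # hours of play the card's price buys (from the article)

implied_card_price = hourly_rate * breakeven_hours
print(round(implied_card_price, 2))  # 3900.0 USD implied cost of the card
```

In other words, unless you plan to game for thousands of hours, renting the GPU by the hour wins.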
It’s interesting to see computers become a rentable resource. People have been attempting to stream computing for a while now, but this setup is seriously impressive. With such a powerful graphics card you could use this for anything intensive. Need a super high-powered video editing station for a day or two? A CAD station to make anyone jealous? Just pay a few dollars of cloud time and get to it!
This project is a wonderful example of what can be accomplished with a rather complicated logic circuit. It’s an Etch-a-Sketch made from a 16×16 LED grid. That in itself is only somewhat interesting. But when we heard about its features, and that it is driven entirely by logic chips, we couldn’t begin to guess how it was designed. There’s no schematic, but the video commentary explains it all.
The thing that confused us the most is that the cursor shines brighter than the rest of the pixels. This is done with two different 555 timers and a duty-cycle trick. When you turn the trimpots, the cursor position is tracked by some decade counters. Pixels in your path are written to a RAM chip which acts as the frame buffer. And there’s even a level-conversion hack that lets the display run at 15 V to achieve the desired brightness.
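The duty-cycle trick is easy to model in software. The numbers below are illustrative, not measured from the build: the idea is simply that a multiplexed LED’s apparent brightness tracks its on-time fraction, so strobing the cursor with a longer on-time than the drawn pixels makes it stand out:

```python
# Toy model of the 555 duty-cycle trick: on a multiplexed LED matrix,
# a pixel's apparent brightness is proportional to its on-time fraction.

def apparent_brightness(on_time, period):
    """Time-averaged brightness as a fraction of full-on."""
    return on_time / period

# Illustrative numbers: drawn pixels strobed at a 25% duty cycle,
# the cursor held on for 75% of the same period.
drawn = apparent_brightness(on_time=1.0, period=4.0)
cursor = apparent_brightness(on_time=3.0, period=4.0)
print(cursor > drawn)   # True: the cursor shines brighter
print(cursor / drawn)   # 3.0x brighter
```

In hardware, the two 555s generate those two different duty cycles, and the cursor pixel gets the longer one.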