Category Archives: Hardware stuff

Hardware fixes, reviews, thoughts

How are LED Christmas lights wired?

Straight out of the box: Holiday Time warm bright white mini lights

If you’re a bit like me, you may have wondered how Christmas lights (or any strand of lights) are actually connected. I’d half figured it would probably be super obvious if I actually looked at them or took a few measurements, occasionally glanced at the mess of wires and realized it’s not obvious from the outside at all, and decided I’d look at it later sometime. Well, today was sometime – playing with microcontrollers I’ve used LEDs from a few strands I got last year after the season (they’re often $1.50-$2 per 50, or 3-4 cents/LED). So before messing up another one, I looked more carefully.

Rearranged into chunks of like parts

Here they are after untwisting, untangling and generally wrangling all the wire into straight(ish) lines, then re-bunching the lengths of plain wire and the lengths of straight lights (light wired to light, no extra wires). Clearer? Well, since it’s still not 100% obvious, here are some labels.

with labeled parts

The lights by the green arrows with three wires going into them are probably the least clear part. It’s worth noticing (because it looked odd unraveled) that the two center lengths of wire (cyan arrows) sit between two piles of half the lights each, as though the halves were several feet apart rather than end-to-end as they originally appeared. First, a peek at the three-wire thing.

Three wire socket closeup

Without the bulb, it’s pretty easy to see that two of the wires are just connected to each other. It’s a T junction, with an LED on the branch heading off toward the rest of the lights. This is the case on all four of them.

As an aside, this is what’s inside the removed bulb, after unbending the legs and pulling the bulb out of its socket. It’s, well, an LED. A little dip on top to diffuse it, then a bigger diffuser over that to make it look more like a “normal” bulb, and a little socket to connect the whole thing. The legs are longer on one side to signify polarity (just like normal LEDs bought as components).

as a circuit

So, simplifying the 3-wire lights and drawing the wires as wires, it looks like this. The power is connected straight through (potentially on to more strands), and in between there are two sets of 25 LEDs connected, each set wired in the opposite direction. This is how normally direct-current, low-voltage LEDs can run off 110 V alternating current straight from the grid: with 25 of them in series, they see a 110/25 = 4.4 V drop each, which is fairly appropriate. There are resistors on the legs of a few bulbs to cut back on the current (didn’t get a picture, sorry). Those look somewhat randomly inserted (soldered onto one leg of a bulb), which kind of makes sense I guess – it doesn’t matter where, as long as there’s one somewhere in each set. Since they’re light-emitting diodes they only allow current in one direction, so when the AC swings the other way, they’re not conducting (or lighting up).
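Just to sanity-check the arithmetic, here’s a quick sketch. The 110 V figure and the 25-LED count are from this strand; the series resistors are ignored for the back-of-envelope estimate.

```python
import math

# Rough per-LED voltage on a series string across mains AC.
# Numbers from this strand: 110 V nominal line, 25 LEDs per direction.
LINE_V_RMS = 110.0
LEDS_PER_DIRECTION = 25

drop_rms = LINE_V_RMS / LEDS_PER_DIRECTION
drop_peak = LINE_V_RMS * math.sqrt(2) / LEDS_PER_DIRECTION

print(f"average (RMS) drop per LED: {drop_rms:.1f} V")   # 4.4 V
print(f"peak drop per LED:          {drop_peak:.1f} V")  # ~6.2 V
```

The peak figure hints at why those resistors matter: the LEDs see noticeably more than the average for part of each cycle.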

During that phase, the set wired in the opposite direction comes on instead, and after half a cycle (1/120th of a second on 60 Hz mains) it switches back. For a moment in between, both are dark. It’s all so fast that I could almost pretend it’s not noticeable, but if you actually look at them, they flicker pretty obviously. If you move a bulb rapidly through the air in front of you so that you see a streak of light, you’ll notice that it’s not actually a streak – it’s a dotted line, as the bulb goes on and off while you move it. If you hold one bulb from each set next to each other and do the same, you can see that each line is “there” at opposite times, like -_-_-_-_. I tried to get some pictures, but persistence-of-vision phenomena like that (similar to a TV screen) don’t really show up well on camera, especially not on one without a shutter speed setting.
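The -_-_ pattern can be sketched in a few lines of Python. This is a toy model of my own: one set conducts on the positive half-cycle, the other on the negative one, and the small threshold stands in for the LEDs’ forward voltage (the real value depends on the string).

```python
import math

def lit_set(t, freq=60.0, threshold=0.2):
    """Which LED set is lit at time t (seconds) on AC mains.
    'A' conducts on the positive half-cycle, 'B' on the negative;
    '-' means both are dark near the zero crossing."""
    v = math.sin(2 * math.pi * freq * t)
    if v > threshold:
        return "A"
    if v < -threshold:
        return "B"
    return "-"

# Sample one full 60 Hz cycle (1/60 s) at 16 points:
print("".join(lit_set(n / 960) for n in range(16)))
# -AAAAAAA-BBBBBBB : the two sets alternate, with dark gaps between
```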

So what about the lengths of wire? Well, here they are drawn closer to scale/position. As you can see, if we want the power in and out at the ends of the strand (rather than at the center), we do need another wire running forward: we have to span the length once just to have a ground/zero/cathode to connect to, and then, since the LEDs block opposite-direction flow, an open vcc/power/anode too to feed the other set. In circuit terms, that means there will be four “spans” of wire, and two of them are indeed between the lights.

That’s about all of it. In short, yes, it was a pretty trivial circuit once unraveled and looked at. But if you wanted to see without the hassle of untangling one, there you go. Now on to seeing what would be the most reasonable way to individually control them.

ONN Model NB14W1201 2000mAh Power Bank

Onn Powerbank model ONB14W1201 – a $2 clearance buy, but handy and functional. After reading a bunch last night about Li-ion chargers, I had a sudden urge to look inside.

Peeking into it. Wait, is that..

..roundness? This looks so much like a plain standard cell, or one of those half-length ones.

It’s a plain ol’ 18650 Li-ion cell with a little driver! The white parts pop apart down the other side, btw..

Circuit closeup

Looks like standard ICs, and they left the numbers on. I’ll look into it more later, but these look like the parts I bumped into constantly while hunting obscure Shenzhen components no one has datasheets for.

Driver Board

Another for good measure in case something is out of focus.

It’s spot-welded in (as internal batteries or individual cells usually are – the strips are often zinc-plated stainless to handle the heat changes). But it could probably be modified to charge 18650s (probably the most common Li-ion cell around and the basis for most “normal” high-power rechargeables), or to recharge your phone from endless 2000-2300 mAh chunks at a few bucks a cell. Possibly even expanded with a few cells in parallel, but that might get adventurous current-wise. Will have to see what the ICs claim to be.

MSI a6200-462US (MS-1681, CR620) getting a cpu upgrade, P6100 to i5-540M

MSI a6200-462US, cr620 or MS-1681 (according to MSI, Europe and its BIOS, respectively)


A friend of mine called the other week, excited to have gotten a new laptop. Rightfully so – it was one of the new MSIs that run an internal GeForce and one of the newer i-core CPUs. Looks awesome. With any luck, the trend of giving more of the heat-dissipation/power-consumption cake (which is really the upper bound in laptops) to GPUs will continue as more software gets on board with parallelizing things that don’t really have to be done one at a time, even “normal” people stuff. Looking at the internals (on the web – different continent), it was surprisingly close to my several-years-old workhorse, an MSI a6200-462US (sometimes called cr620 internationally, MS-1681 if you ask the BIOS, and possibly other things as well). I peeked, slightly hopeful, into whether there’s any easier way to run better graphics chips in them now. There isn’t (no surprise) – you could still cram an external desktop GeForce into an enclosure and run it over PCIe 1x on a riser cable into the on-board WiFi card slot or the external ExpressCard slot (currently a USB3 port), but that’s pricey and tricky modding for something not terribly functional or portable. But reading about it, I looked a bit at the CPU again.

If you look into putting a beefier CPU into a laptop, you’ll find thousands of message board posts where the first guy says “Could I..?” and four to fifty-two more say “No. NO! And even if you could, it would be a bad idea and totally not worth it”. I realized that I’d seen nothing much to back that up besides sheer number and volume. So I persisted a little longer this time and actually found two (yes, a whopping two) people saying “I did that”. One didn’t elaborate much; the other offered before/after CPU-Z screenshots and a few pictures of the inside, saying it worked fine.

Looking even more carefully at mine, which has an Intel P6100 (Intel HD GPU on the same package; 35 W total, 25 W CPU, 12.5 W GPU), there were even versions of the same laptop sold with i3 and i5 chips. In theory the motherboard should even be cool with a 1st-gen i7, though perhaps not so much the BIOS. So.. started dreaming small: an i3-370M. 2.4 GHz instead of 2, hyperthreading to four threads, PassMark up to around 2000 (from 1380 – like a 45% gain). Power consumption and heat profile identical. The entire laptop, including power supply and BIOS, has been sold with it already inside. I can’t think of a reason other than cost this shouldn’t work. So that must be what everyone is backing off over, I figure – if the chip costs as much as a similar (aging but functional) laptop, it’s not worth it. Look it up on eBay and… $4.83. Buy-it-now, shipping included. You’re totally kidding me. *That* is the “OMGWTFBBQ SOHARD WONTWORK” folks are bitching about? The price of a cheeseburger and eight screws? It’s on.

Opened up


P6100 being replaced


I look some more and settle on an i5-540M instead – $20 (again, with shipping) and the same power profile otherwise. It’s the upper end of things; going much newer gets more like $50-$100 – not bad, but a lot to lose if I toast it in three seconds and gain nothing (at $20 I can eat that, no problem). It arrives and I open it up. It all goes extremely smoothly. Open the back, off with the CPU shield, turn the ZIF, CPU goes out, CPU goes in. Tighten the ZIF, dab of thermal grease, everything back on again. Took perhaps 20 minutes, and that’s because I took the time to clean the fan out since it was already apart (it can’t be done without putting on new thermal compound, so I don’t do it that often). Turning it on, it was in fact running a little hotter – until I had a magic déjà vu moment from the last time I cleaned the fan: that time, and this time, I forgot to plug the fan back in. Quick bottom off, plug in, bottom on, and it was all better.
My precioussss.. Ok $20 on eBay, but still


Been running it a day and it works fine. Under full load with IntelBurnTest the temps go up to the 80s after a few minutes, possibly a little faster than with the old CPU (less than a minute’s difference), but that’s really going a little over spec anyway – as you may have noticed last paragraph, a 25 W CPU and a 12.5 W GPU don’t in fact add up to the 35 W the cooler/power supply is rated for – and it’s not terribly unreasonable. Stuff that used to put it in the red doesn’t at all now, though. What used to be a 100% all-cores load is now more like 60%, chilling along at 40-45 °C. I’ve literally run jobs that formerly required underclocking it slightly (locking it to 1.7 GHz) or taking breaks, without the fan even turning on. Running PassMark it benches 2800, almost exactly twice the former 1390.
New cpu in place


So.. I guess if you hear a lot about how worthless it is to put a fresh CPU in an older laptop, it might very well not be. Make sure the machine is built for it and works with what you get, and don’t break the bank hoping to make it super modern, but making it “best in its class two or three years ago” is totally doable. This one is now a little unbalanced – the built-in disk is a bit slow and 4 GB of RAM is kind of tight; maxing the RAM out to 8 GB and putting a solid-state drive in the CD bay for the OS would even it out and boost it a lot, but those are $60-$120 jobs (each), while this was the cheapest and simplest improvement I’ve made to this laptop so far. I’m thinking it might last a little while yet – supposing it doesn’t last so long that the laptops people give away for free with smashed screens or fried CPUs are better than it.
Reassembled
CPU-Z screenshot

Started working on I2C with my new Bus Pirate

A Bus Pirate connected to the pins of a Kindle 3 battery

I’m trying to get into hardware some more, because it’s kind of cool and a lot of things I skipped over because “I’m a software guy” I’m starting to realize I really skipped because “I think that looks hella hard and I already look pretty cool doing software” even though I wanted to know about it. So I bought a Bus Pirate.

Just to get it out of the way, I got it at [Seeed Studios]. They’re made by [Dangerous Prototypes]. The version linked here has a counterfeit FT232 chip (for the serial communication), so you’ll need an older set of FTDI drivers, version 2.08.28 (*NOT*, and this is important, 2.08.30 or 2.10 – those only work with legit ICs), which I got [here]. The page is unrelated afaik. Some units may have non-counterfeit chips and work with regular drivers, but mine doesn’t, and I’m not thinking Seeed has switched batches, because they’re still on one from early 2012. There’s another version of the Bus Pirate itself (4 instead of 3.6), which costs slightly more (not much) and does slightly different things (comparing them is over my head right now). I’ve also heard later drivers mess up under Windows 8.1, but 2.08.28 still works – I’m on 8.1, 64-bit, and they work fine, fake chip and all. Beyond the driver thing, there’s apparently nothing much wrong with the counterfeits (hey, it’s all from China anyway; if anything, counterfeits are slightly more likely to be made by small groups working for themselves on something resembling their own terms), and the later swaps apparently use a legal chip that’s just slightly worse, so don’t fret too much over it function-wise. I went with [Tera Term] on Dangerous Prototypes’ suggestion, but a later version, as I’ve installed drivers for about a zillion cell phones to work as serial devices and the older 3.11 version only goes up to COM12. The Windows terminal will allegedly work but give you issues – I didn’t bother to find out, because it’s a pretty sketchy terminal anyway, so I have no trouble believing the rumors, and it’s not installed by default in Windows 8+ anyhow, so something would be getting installed either way.

So if you were looking for links, there they are. The Bus Pirate is a tool for reading, writing and eavesdropping on a variety of protocols that hardware devices use to talk to each other or, more commonly, to parts of themselves. I’d heard about these in passing in electronics class in high school, but dismissed them as kind of useless unless I was building massive devices (wasn’t planning on it). However, *modifying* massive devices (by massive I mean like.. a microwave. Or a router. Or a digital watch. You know, so much bigger than what you could reasonably design and build on a basement workbench. Unless you have mad ninja electronics skills, in which case good for you) is another matter, and these days they’re harder to mod because everything is tiny and integrated, but in some ways easier because they’re super modular. If, say, your microwave needs to time a minute, display the countdown and then beep, it doesn’t have a discrete timer, a complex set of gates to translate it into LEDs lighting up, and a little oscillator circuit that puts a tone out a speaker when powered. Well, it does, sort of, but they’re packed into chips. So it’s more like a microcontroller (a system-on-a-chip, or a CPU and a small memory chip – a tiny computer), a display driver (perhaps built into the display), and perhaps a buzzer by itself or a direct processor-to-speaker connection. No one built the whole thing; separate people built each piece, and someone (or someones) assembled the end control circuitry like Lego, picking the pieces they wanted and making them talk to each other. So if you want to mod it, you don’t have to dive into the depths of it either – just figure out what each part does and modify it on that level. Just like software, hardware has become more and more high-level: it’s cheaper to make a general-purpose circuit and then use it for tons of stuff (even when it’s overkill) than to build custom circuits for everything. Some special things still need the low-level optimization touch, but mostly they don’t. So all that’s needed is a way for all these things to talk.

I’m trying to start with I2C. It’s a [protocol developed by Philips](http://en.wikipedia.org/wiki/I%C2%B2C) sometime long ago (actually the early 80s). The basic idea is to transmit data between two things over two wires: one clock, one data. But it’s built to have lots of things connected, so they all have to be able to talk without talking over each other or messing things up when they’re quiet. This is accomplished through two particular things. First, each connected thing has an address (7 bits in the usual addressing mode, so you can have 127 minus a bunch of reserved ones connected at once). Second, when communicating, no one ever tries to force either the clock or data line high; instead, they remove power from a line by connecting it to ground. One device (usually the master – or one of the masters if there are several, though that isn’t super common – which also delegates who gets to talk and when) or a separate connection holds both lines to power via a resistor, so each line usually sits there floating calmly at, say, five volts. If anyone talks, they ground the line, making it 0 V, with a small current flowing into ground because of the resistor. Think of it as a rope along the side of a train that you can pull to signal “stop at next stop”. When everyone is saying nothing, the rope just sits there.

Anyone can, however, pull it, and now everyone sees the rope go taut (should they bother to look) until it’s released. Everyone can look and, at times, interject with a pull – it doesn’t matter how many of you there are. To avoid lots of people pulling at once, one module (or more, but usually one) is the master. If it wants to hear from someone, it yanks both the clock and data lines low within a certain space of time. When this happens, all other modules pay attention to what happens next. The master either pulls the data line low or leaves it high, then pulls the clock low and releases it. The high/low on the data line when the clock moved was a bit. There’s lots more formatting to it, but in this manner the master bleeps out (with data pulls/no-pulls and repeated clock pulls/releases) an address and whether it wants to speak to it (like “I want to set bytes 1 through 8 on the display to what I’m sending next”) or hear from it (like “I want whoever is the thermostat to state the current temperature”). Supposing it panned out, said device pulls on the lines in response (called an ACK, for acknowledgement), meaning “I heard that and I’m prepared to do it”. After that, it and the master can use the two lines to signal (to the master or the slave device, depending on which direction was specified) until whatever data it was has come across. Once done, another specific clock/data yank (timing-wise) signals “Ok everyone, that’s all done now; we’re back to leaving the lines alone and listening for next time”.
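The rope analogy maps straight onto how the lines behave electrically: a line reads high only while nobody pulls it low, no matter how many devices share it (“wired-AND”). A toy model of that behavior (my own sketch, nothing to do with the Bus Pirate itself):

```python
class OpenDrainLine:
    """Toy model of one I2C line: pulled high by a resistor,
    driven low by any device that 'pulls' it (wired-AND)."""

    def __init__(self):
        self._pullers = set()  # devices currently pulling the line low

    def pull_low(self, device):
        self._pullers.add(device)

    def release(self, device):
        self._pullers.discard(device)

    @property
    def level(self):
        # High only if no one is pulling; any single puller wins.
        return 0 if self._pullers else 1

sda = OpenDrainLine()
print(sda.level)        # 1: idle, the resistor holds it high
sda.pull_low("master")
sda.pull_low("slave")   # several devices can pull at once, harmlessly
print(sda.level)        # 0
sda.release("master")
print(sda.level)        # 0: the slave is still holding it low
sda.release("slave")
print(sda.level)        # 1: everyone let go, back to idle
```

This is also why a failed component that grounds a line permanently kills the whole bus, as discussed below.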

It’s a lot more complicated than that, and feel free to read up on it, but the Bus Pirate (and lots of other things, including all I2C-compatible components) already implements it, so a general idea is really plenty. It’s perhaps noteworthy that it’s a somewhat fragile protocol – if one of the components fails in such a way that it grounds one or both of the lines, everything is broken and no one can talk to anyone. Because of this, many devices have more than one I2C bus (set of two lines) to keep critical systems safe even if lesser systems fail. For instance, a car (which often does use I2C or a derivative of it) often has a separate bus for comfort stuff like “open the window” or “turn the air conditioner off” and another for “move transmission to reverse” or “adjust ignition timing to these values”, as one is more important than the other. Even so, it’s often quite muddled at this level – many things (yes, including cars) in fact have buses mixing quite a lot in terms of criticality and purpose. It often gets that way when one bus is heavily loaded and another not used much; after all, the end consumer is unlikely to know about it until it fails, and even then they usually won’t know *why* it failed or see that another six feet of copper wire could have made the failure far less severe.

But anyhow.. I want to speak it. So I installed the above serial drivers and hooked up the little board. Started Tera Term, set the port (under Setup->Serial port) to 115200 bps, 8 data bits, no parity, 1 stop bit, COM16 (the new serial device that popped up in Windows Device Manager) and pressed enter. The Bus Pirate responded “HiZ>”.

Actually, I did no such thing – I tinkered with failing drivers, messed-up terminals, broken USB cables and so forth for ages. But if it had gone right, that’s what would have happened. By “HiZ>” it means it’s in high-impedance mode, meaning all ports are measuring whether they’re high or low (in some cases the exact voltage on them) and putting out no power themselves. It’s the default setting and it’ll disturb nothing – if it’s connected to something powered, it won’t short it to ground; if it’s connected to something grounded, it won’t try to pump power into it. It’ll just sit there looking at things, not harming itself or others. The “>” is because it’s a prompt – you can type stuff to it. “?” for instance will give:

General                                 Protocol interaction
---------------------------------------------------------------------------
?       This help                       (0)     List current macros
=X/|X   Converts X/reverse X            (x)     Macro x
~       Selftest                        [       Start
#       Reset                           ]       Stop
$       Jump to bootloader              {       Start with read
&/%     Delay 1 us/ms                   }       Stop
a/A/@   AUXPIN (low/HI/READ)            "abc"   Send string
b       Set baudrate                    123
c/C     AUX assignment (aux/CS)         0x123
d/D     Measure ADC (once/CONT.)        0b110   Send value
f       Measure frequency               r       Read
g/S     Generate PWM/Servo              /       CLK hi
h       Commandhistory                  \       CLK lo
i       Versioninfo/statusinfo          ^       CLK tick
l/L     Bitorder (msb/LSB)              -       DAT hi
m       Change mode                     _       DAT lo
o       Set output type                 .       DAT read
p/P     Pullup resistors (off/ON)       !       Bit read
s       Script engine                   :       Repeat e.g. r:10
v       Show volts/states               .       Bits to read/write e.g. 0x55.2
w/W     PSU (off/ON)            /<x= >/   Usermacro x/assign x/list all
HiZ>

That’s kind of a mess of options, so let’s not worry too much yet. To make it interact with I2C, I type “m”, getting:

1. HiZ
2. 1-WIRE
3. UART
4. I2C
5. SPI
6. 2WIRE
7. 3WIRE
8. LCD
9. DIO
x. exit(without change)

(1)>

and “4” to choose I2C:

(1)>4
Set speed:
 1. ~5KHz
 2. ~50KHz
 3. ~100KHz
 4. ~400KHz

(1)>

It’ll ask for a speed. I’m planning on talking (at least verifying I can talk) to a battery. I have no idea what speed it speaks, so I go with 2 rather randomly (it happened to work in this case; I could have gotten no data, in which case I’d try something else).

I now connect some of the pins on the board to what I want to talk to. If I had a probe kit (also available from Seeed) I’d just connect it, but I don’t, because it’s still shipping from Shanghai or something, so I connected random pieces of wire from an old computer instead and labeled them. The board clearly (and somewhat helpfully) labels all its pins, but many are used for more than one thing. In this case, the clock pin (the up/down-per-bit one) is labeled “CLK” on the board; on the battery it’s “C”. The data pin (current bit being transmitted) is “D” on the battery and “MOSI” on the board. Yeah, for a while I thought it’d be AUX – it isn’t. MOSI is really a term from another protocol, meaning “Master Out, Slave In”, but in I2C mode, that’s the data pin. Also, the battery has a 3.3 V (ish) out, which I connect to 3.3V on the board, and a ground, which I connect to GND.

Not done yet, though. Remember how something has to hold these lines at voltage when not in use? Yeah, no one is doing that. But the board has the option of doing it – one of the pins is labeled “VPU”, for Voltage Pull-Up. If I type “P” (capital P; lowercase p turns it off), it’ll connect that pin to the appropriate lines, via resistors, to hold them high. So I connect that to the 3.3 V as well, since that’s what the battery considers high. Just for kicks, I connected the ADC pin (analog-to-digital converter; by default it plainly measures the voltage on it, up to 6 V) to it too. So, set the pullup and do a “v” to get a readout of the current state of the pins.

I2C>P
Pull-up resistors ON
Warning: no voltage on Vpullup pin

(As an aside here, I’ve yet to actually connect any of the wires above, and the 5 V and 3.3 V supplies are still turned off, as is the default. So it’s right – the pullup resistor I just connected has no voltage on it.)

I2C>v
Pinstates:
1.(BR)  2.(RD)  3.(OR)  4.(YW)  5.(GN)  6.(BL)  7.(PU)  8.(GR)  9.(WT)  0.(Blk)
GND     3.3V    5.0V    ADC     VPU     AUX     SCL     SDA     -       -
P       P       P       I       I       I       I       I       I       I
GND     0.00V   0.00V   0.00V   0.00V   L       L       L       L       L

(Yup, zeros and lows as expected. I connect the four pin-bundles)

I2C>v
Pinstates:
1.(BR)  2.(RD)  3.(OR)  4.(YW)  5.(GN)  6.(BL)  7.(PU)  8.(GR)  9.(WT)  0.(Blk)
GND     3.3V    5.0V    ADC     VPU     AUX     SCL     SDA     -       -
P       P       P       I       I       I       I       I       I       I
GND     4.06V   0.00V   4.09V   4.06V   L       H       H       H       H

(Woho! The battery is registering! As it’s a lithium-ion, it’s a bit over 3.3 V, but that’s ok. And since the pullup resistor is sufficiently large and not just a short to GND, it’s not violently catching fire as it overheats and spews elemental lithium, which is nice since it’s on my lap. It’s probably not that drastic in reality – the battery most likely has its own overcurrent protection, and I’m only holding the wires in place with my hand, so I’d yank them at once. I invoke “(1)”, which is a macro that issues a read and a write to each possible address and reports which ones answer.)

I2C>(1)
Searching I2C address space. Found devices at:
0xAA(0x55 W) 0xAB(0x55 R)

I2C>

Double woho, it speaks! Apparently it considers 0xAA to be its name (0xAB for reading). Next step: try to find out what it says, either by pretending to be the battery and listening to the device, pretending to be the device and listening to the battery, or just eavesdropping between the two. But that’s enough for now, I think.
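About those two numbers: on the wire, the I2C address byte is the 7-bit address shifted up one, with the read/write flag as the low bit – so 0xAA and 0xAB are both really address 0x55, matching what the scan printed in parentheses. A tiny helper of my own, just to illustrate the convention:

```python
def split_i2c_address(byte):
    """Split an on-the-wire I2C address byte into its 7-bit
    address and direction ('R' for read, 'W' for write).
    The low bit is the direction flag; the rest is the address."""
    return byte >> 1, "R" if byte & 1 else "W"

addr, rw = split_i2c_address(0xAA)
print(hex(addr), rw)  # 0x55 W
addr, rw = split_i2c_address(0xAB)
print(hex(addr), rw)  # 0x55 R
```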

In case you made it this far, I’ll go ahead and mention that the device and battery are a Kindle Keyboard (aka Kindle 3, K3) and its battery. My hope is to figure out the talk between them as a first step toward the Kindle 2’s and Kindle 4’s batteries. I don’t have functional batteries for those and would like to program something to pretend to be one, to run them off another power source.

Review, Chasetac USB 3.0 ExpressCard

My CD drive broke. I hadn’t used it for ages (though I’ve suddenly missed it twice since – go figure) and realized it’d be better to connect my external HDD to its port (laptop, tight on ports). I checked whether that’s really possible (yup – there are even special docks shaped like a CD drive for an extra TB of HDD or a few hundred GB of solid state), and found people hooking up sound boards, desktop-size PCIe 1x controllers, even graphics cards (barely – external power and a tight bus) to the mPCIe port you usually use for internal WiFi and to ExpressCard ports. I noticed I have an ExpressCard slot. Awesome! But.. too pricey. So I decided to check what they want for the SATA cabling; if I soldered it from parts it should work, but it could be sketchy. They have them, but at around $15-20 and always in parts (lots for internal use only). There were some separate SATA controllers for the PCIe route. And, cheapest of all at $12, a two-port USB 3.0 card. I vaguely remembered seeing that my external drive supported that. Sold.

So, after that run-around-y way of getting there, there it is. I’m thinking this is kind of a niche product, but fortunately I’m in it. It is (as you can see) a little on the large side – it would probably need to be disconnected before moving the laptop. I already do, as it’s hooked to an external drive that doesn’t move much anyway. It needs power (it can come from a USB port, which may be an issue, plus you lose a port – but no worries: I’m glad it follows spec, and it can be fed with a random 500 mA dongle instead of USB).

Installation, however, was flawless. Plugged in, waited a moment, done. I’ve only (predictably) tried one device, but it worked really well. Here’s data.

As you can see, I’m hardly bottoming it out here – it’s only a bit past what USB 2.0 should do in a perfect world. You can also see, though, that USB 2.0 was very much doing nothing of the kind in practice. It’s a pretty cheap drive, so I’m thinking this is probably about what it’d do regardless of controller – the bottleneck is the drive itself at this point. I’ve used another similar drive outside its enclosure when it was failing and I was trying to rescue some data, and while I didn’t really run benchmarks, it ran about like your average HDD, no slower or faster, even hooked straight to a fast SATA II.

Very satisfied with this gizmo, all in all; hoping to carefully pick out another USB 3 device to add when I figure out what would be worth it. Do keep in mind to check what your machine has available when expanding or adding on – I’d never really considered anything but USB (1 and 2), figuring.. well, it’s a laptop.. but when it came down to it there are *two* nice chunks of PCIe and a whole SATA port for the unused CD player. I could hook up four more of these before there’d be any reason to consider the (crawling) speeds of USB, and even then, there’s also an Ethernet port (used, but a networked drive on the LAN is still faster than USB 2). A friend of mine mentioned doing something similar today, putting in an SSD after realizing there was an mSATA port with some space on it. Could be the CD bay’s fate if I come into money.

OpenWrt on Vizio WR-100 (XWR100) behind an NVG510 modem

This is sort of a second attempt at this. I got the XWR100 the other day and decided to root the NVG510 (AT&T version) to enable bridge mode (i.e. have it do only ADSL2+ and pass the data through, letting the router handle authentication and any routing), hoping to get more control over the protocols that confuse it (most do). It worked, but then promptly went south. I’m not sure why, but most likely it did something on top of just the DSL, like trying to get itself an IP. I guess how well that setup works in general is still debated, with some having no issues and some claiming it’s super fragile; it’ll probably keep going like that until the last NVG510s are finally gone.

After sorting things back to normal(ish), I decided to instead go with a lighter IP passthrough, which accomplishes roughly the same thing. I also wanted the XWR100 to run OpenWrt, though. OpenWrt is a third-party open source firmware for routers (mostly – really for anything embedded) which can do a lot of nicer things, including many that aren’t related to routing at all. I had to quit using it a few years ago, lacking a good way to set it up with a different modem, but wanted it back now that I’m putting in a better router anyway.

Looking at their lists, there is a version for it, but they only distribute source. I didn’t particularly like the idea of building it. Or rather, I like the *idea* well enough, but I’m not so positive I’d get it all correct, and since it’s flashed to the memory of the router, getting it wrong can make it hard or impossible to fix. After looking around, I found a build from around the same version (here’s a copy), and not finding much else short of building it myself, I went with that.

Since it’s a factory-type image, it can be flashed just by itself from the stock firmware (some images are only for upgrading from an existing install, or need an initial ‘crack’ flash to exploit past blocks on flashing; this one appears open to start with). It’s also possible via U-Boot, its installed bootloader, but I didn’t look into that beyond noting it seemed possible should the original flash fail. Besides “be sure you have the correct firmware”, the only other solid advice seems to be to do it over a regular cable (no wifi or such) and to do a factory reset before and after flashing. Seems simple enough.

Doing a factory reset on this router is the same as on many others, what’s known as 30-30-30 – hold down the reset button for 30 seconds, disconnect power (still holding the button), wait 30 seconds, reconnect power (still holding it), wait 30 seconds. I think it was first the method on some Broadcom-based router, then on all of them, and now on quite a lot that have other chipsets entirely. If it sounds like you’ll be holding down a reset button for a full minute and a half, you’re right. Try to take it as a lesson in how impatient we’ve grown with life in general or something, but do suck it up and do it – it’s not that long, and it clears data that could otherwise cause major issues (among a whole slew of other more or less likely problems).

Back of a Vizio WR-100

Here is what the back of the router looks like. The little red dot inside the hole next to the power connector on the right is what needs to be held down. I broke off a piece of a matchstick to hold it down a little more comfortably and did exactly as described. The light next to the power button will blink a bit when first pressing it, then stop, but start again at around 30 seconds. Don’t count on that, just time it. After the second 30 and replugging it, it’ll start blinking off and on, then stop, and restart again a little before the final 30 is up.

holding

After that, plug in a computer (as on the picture) into the port labeled “1” among the four labeled “Ethernet”. After a brief while, your computer should get an IP address and claim to be on a network. It will likely point out that there is no internet through it, which is true. If you’re on a PC, you can open a command prompt (start->run->”cmd”) and run ipconfig (just type it and press enter) to list some stats about the network connections. Under the Ethernet adapter, it should list an IP address, like:

Ethernet adapter Main Ethernet:
Connection-specific DNS Suffix . : lan
Link-local IPv6 Address . . . . . : fe80::8837:983f:4165:8c2b%12
IPv4 Address. . . . . . . . . . . : 192.168.1.64
Subnet Mask . . . . . . . . . . . : 255.255.255.0
Default Gateway . . . . . . . . . : 192.168.1.1

If the Default Gateway isn’t 192.168.1.1, something is wrong, as that’s what the router defaults to, handing you an IP somewhere in 192.168.1.x. You can force it to ask again by running “ipconfig /release” followed by “ipconfig /renew”, or unplug the cable, wait a second and plug it back in. Once it’s sorted, open a browser and go to 192.168.1.1. It should pop up the normal start screen.
Router startup screen
Clicking the center symbol for the router and the “advanced” button next to “Firmware version” goes to the firmware upgrade screen. Most things in this interface are placed kind of at random.
Clipboard01
Since I have a file, I’ll click “Choose File”, pick my .bin from before and click Start.

upgrade

This will take a while. And sometimes, it’ll pause a bit. Do not, however, mess with it! It has to finish on its own, say it’s done, and then sit there and flash its lights for a while until it’s completely done and everything has been written to the flash.

waiting

You will get extremely nervous doing this. Several times, you’ll be sure something has gone wrong. You’ll be unable to convince yourself it hasn’t. Moral support from others or practice doesn’t help. The normal procedure is to just sit there all jittery peeking at all the things going on (very few), looking away for a split second sometimes but then back until it’s finished.

Once done, 30-30-30 again. Refresh.
openwrt login screen

Tada! The root password is admin and needs to be changed. The interface for setup is pretty average looking – a little less fancy in terms of pretty pictures, but with all the “normal” setup there, telling it to either take over from the modem (if it’s in bridge mode) or receive an IP from the modem and run over it. I set the modem to IP passthrough (assign the public address to a specific MAC, the one of the router), giving this router the public IP although the modem still handles authorization – AT&T insists, it appears – and thinks it might give out other local IPs, but nothing is hooked to it so that won’t actually happen. I ended up just statically assigning it the address the modem has held for the last year and a half, because I wanted to bypass the DNS setting.

iface-setup1
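For reference, a static WAN along those lines boils down to a few lines in OpenWrt’s /etc/config/network. This is only a sketch – the interface name, addresses and DNS server below are made-up placeholders; the real values are whatever the passthrough actually handed out:

```
config interface 'wan'
	option ifname  'eth0.1'          # WAN port; the actual name varies per build
	option proto   'static'
	option ipaddr  '203.0.113.5'     # placeholder for the public IP from the passthrough
	option netmask '255.255.255.0'
	option gateway '203.0.113.1'     # placeholder; the NVG510 side of the link
	option dns     '8.8.8.8'         # any resolver here bypasses the modem's DNS
```

The same thing can be set from the web interface; the file is just where it ends up.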

I could technically do that another way, but I’ll do it later – so much to explore. A small confusion was that the modem won’t switch its own IP or consider another range, so this one had to scoot over to 192.168.2.x, but no big deal. I didn’t find the place for enabling WPA at first, but it turns out the tab was right there in the wireless settings and I managed to miss it somehow. Even without it, I could ssh in and modify things with the Unified Configuration Interface, a settings database accessible from the command line with “uci” followed by show, list, set, etc. to modify most of what is running. There’s also opkg (also under Software in the web interface), a package manager that can download and install tons of extras: providing and using VPNs, most of the major transfer protocols if I hook up a USB drive to it (otherwise too, but there’s only eight megs of storage..), running a phone switch or plain VoIP phone and so on. There are such piles of stuff, most of it very small and extremely tightly written too.

I’ll probably continue to explore for a while, though before I even started I wanted a list of the packages and just typed “opkg list” (the name is easy to mix up with Debian’s dpkg, the older ipkg, etc.). I wanted the list on my side so I could browse it outside the command prompt, then realized I could really just do “opkg list >/www/opkg.txt” and load “192.168.2.1/opkg.txt” in the web browser to view it. And why couldn’t I before? It’s mindblowing what small freedoms a stock router denies you – and 99%+ of the time it runs Linux too; the only thing is it’s hard to get into because it’s locked, and when you do, it’s not very built out, because it’s just one specific model that doesn’t work with anyone else’s projects, not a well-used platform tons of people write for.
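To make the uci part concrete, here is roughly what such a session looks like – a sketch assuming the default section layout (wireless.@wifi-iface[0]; section names vary a little between versions, and the passphrase is obviously a placeholder):

```
# Dump the current wireless settings to see what's there
uci show wireless

# Enable WPA2 from the shell instead of the web tab
uci set wireless.@wifi-iface[0].encryption='psk2'
uci set wireless.@wifi-iface[0].key='some-passphrase'
uci commit wireless
wifi            # bring the wireless back up with the new settings

# The package-list trick: dump it under the web root and browse it
opkg list > /www/opkg.txt
```

Everything uci touches lives as plain files under /etc/config, so editing those directly and committing works just as well.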