The purpose of the resistor is current limiting.
In a low-impedance circuit, the current can indeed increase well beyond the limits the load device can handle, especially with a low-impedance, high-bias-voltage device like a semiconductor. This is especially true of something like a free-running transistor, which can experience thermal runaway if the impedance attached to the collector is too low. LEDs can be susceptible to a similar effect, although to a much lower extent than most semiconductors, because their junction impedance is much higher.

A normal diode's only way of dissipating energy is to produce heat. That heat decreases the junction resistance, which in turn increases the load current, which increases the heat, and so on. Typically a diode junction (including the emitter-base junction of a transistor) needs only a few hundredths of a volt of change between the onset of conduction and the maximum permissible load. For most silicon diodes and transistors, that conduction region sits near 0.7V; very high-power semiconductors may range all the way up to 1V or just a bit above.

LEDs, however, radiate a large share of their energy as light rather than heat, which is why they are so efficient. Thus, as the voltage across the LED increases, instead of getting a lot hotter, it mostly gets brighter. Of course, there is still some internal junction resistance, so it does get warmer, and it will eventually burn out if the current causes the heat to exceed the dissipation capability of the device.

For a single LED fed by a low-impedance power supply capable of delivering both more current and more voltage than the LED can take, some sort of limiter is definitely a good idea. A resistor is the easiest and cheapest solution, but it is not very efficient: much of the power provided by the source is dissipated by the resistor.
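As a concrete sketch of the limiting-resistor idea (the supply voltage, forward drop, and target current below are illustrative assumptions, not numbers from any particular datasheet):

```python
# Sketch: sizing a series current-limiting resistor for a single LED.
# All values are illustrative assumptions, not datasheet figures.
def limiting_resistor(v_supply: float, v_forward: float, i_target: float) -> float:
    """Resistance (ohms) that holds the LED current near i_target amps."""
    return (v_supply - v_forward) / i_target

# A 12 V supply, a 3 V LED drop, and a 20 mA target current:
print(round(limiting_resistor(12.0, 3.0, 0.020), 1))  # 450.0 ohms
```

The resistor simply absorbs whatever voltage the LED does not, at the chosen current.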
For example, if the LED is fed from a 12V supply through a limiting resistor that leaves a 3V bias on the LED, the resistor drops the other 9V. Since the same current flows through both, 75% of the power delivered by the supply is dissipated by the resistor, for an efficiency of only 25%. Three quarters of the input power wasted as heat is a poor showing. If, however, the power supply voltage is less than the maximum forward voltage specification (typically around 3V - 6V) of the LED in question, then no limiter is necessary. The same is true if the supply has a high output impedance, but such is not the case here.
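Because the same current flows through the resistor and the LED, the power split is simply proportional to the voltage drops. A two-line check, using the same illustrative 12V / 3V figures:

```python
# Power divides in proportion to voltage drop when the current is common.
v_supply, v_led = 12.0, 3.0        # illustrative values from the example
v_resistor = v_supply - v_led      # 9 V left across the resistor
efficiency = v_led / v_supply      # fraction of supply power reaching the LED
wasted = v_resistor / v_supply     # fraction dissipated by the resistor
print(efficiency, wasted)  # 0.25 0.75
```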
In the case of the string of LEDs, the situation is even milder. First, the designer will build the string with enough LEDs that the voltage across any one LED, given a 160V (or 320V) peak input, is less than its maximum allowable forward voltage. Second, the load impedance of, say, 30 LEDs in series is 30 times the impedance of a single LED. A limiting resistor generally has to have a resistance greater than the active load impedance - generally much, much greater - in order to be effective. Inserting a resistor that large into the LED string would make the LEDs so dim as to be completely useless, unless your AC power supply could produce something on the order of a kilovolt.
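A back-of-the-envelope version of that sizing rule (the 3.2V-per-LED limit here is an assumed figure, not a spec from the string in question):

```python
import math

# Sketch: smallest series string that keeps each LED under its rated drop.
def min_leds(v_peak: float, v_max_per_led: float) -> int:
    """Smallest LED count so each one sees at most v_max_per_led volts."""
    return math.ceil(v_peak / v_max_per_led)

print(min_leds(160.0, 3.2))  # 50 LEDs for a 160 V peak
print(min_leds(320.0, 3.2))  # 100 LEDs for a 320 V peak
```

With enough LEDs in series, the peak line voltage divides down to a safe per-LED drop with no resistor at all.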
There's no resistor in there.
LED's do not have that capability and as soon as the voltage exceeds the voltage drop of the LED, the current would take off and fry the LED's if there wasn't something there to control it.
See above. They certainly do have the ability as long as the input voltage is not greater than the maximum forward voltage spec. To the circuit, they look like a medium sized resistor. A regular diode looks like a very small resistor.
The reason for 3 wires is that the string is in a series parallel circuit. One wire is for the resistor output going to the LED's just past it and the other 2 wires are the common for the string and the full line voltage going to the next group of LEDs where you will find another cylinder with another current limiting resistor.
Then there would be no reason for two wires to go into the box or three to come out. The resistor only needs to be attached, and indeed can only be attached, at two points. Neither of the bypass leads would need to touch the resistor, only the lead attached to the input of the first LED. There would also be no reason for a second, let alone a third, resistor.
What would be any possible advantage in doing that? The LED's are themselves half wave rectifiers.
Do a Google search for "full wave rectified LED", and you will see a number of vendors explaining the advantages of full-wave rectified LEDs. In short, a half-wave string turns on and off 60 times a second: out of each 16.7ms cycle it conducts for less than half the time and is completely dark for the rest, so it is only really effectively on about 30% of the time. A full-wave rectified LED first of all produces twice the amount of light (roughly 40% brighter to the human eye). It also flickers at 120Hz, not 60Hz. A 60Hz flicker can be perceptible to the human eye in many cases, especially when the flicker swings from full bright to full dark and the dark period is more than half the total period. A 120Hz flicker is not distinguishable to the human eye except in a few very special cases. What's more, the full-wave LED spends more than 60% of the time illuminated, with each dark interval lasting only a few milliseconds.
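The on-time figures can be estimated directly from the waveform. With assumed illustrative numbers (156V peak line, a 90V total string drop, i.e. 30 LEDs at about 3V each), a half-wave string conducts roughly 30% of the cycle and a full-wave one roughly twice that:

```python
import math

# Fraction of each line cycle a series LED string conducts, treating it
# as "on" whenever the instantaneous voltage exceeds the string's drop.
def duty_cycle(v_peak: float, v_string: float, full_wave: bool = False) -> float:
    if v_string >= v_peak:
        return 0.0
    # sin(theta) exceeds v_string/v_peak between asin(r) and pi - asin(r).
    theta = math.asin(v_string / v_peak)
    half = (math.pi - 2.0 * theta) / (2.0 * math.pi)
    return 2.0 * half if full_wave else half

print(duty_cycle(156.0, 90.0))                  # roughly 0.30
print(duty_cycle(156.0, 90.0, full_wave=True))  # roughly double that
```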
The LED's are themselves half wave rectifiers.
True, but one still must have a pair of diodes in each cylinder to make it a full-wave rectified string. You need to draw it out to see why, but the next-to-last diode in the segment going in each direction has to point the opposite way from the LEDs. More to the point, most LEDs do not have very high reverse-voltage specs, and a voltage spike in the reverse direction could fry them. If I were the designer, I would put in a pair of forward diodes along with the LEDs on each segment even for a half-wave string. A conventional power diode rated for a couple of kilovolts of reverse voltage is still very cheap.
Please explain how the output of a 110V (actually more like 120V) is 156 volts.
AC power feeds (like the one to your house) are specified as RMS voltages. The voltage of an AC signal varies over time, so one cannot speak unambiguously of "the" voltage of an AC signal without employing some time-related function specific to the waveform in question. For power applications, however, all one really wants to know is how much power the circuit can deliver. We can assign a single parameter to the signal by taking the power it would deliver to a resistor of a given size and calculating the DC voltage that would deliver the same power to that resistor. Doing so sidesteps the need to deal with the actual waveform. This is known as the RMS voltage.

For a sine wave symmetrical about 0 volts, the RMS voltage equals the peak voltage (half the peak-to-peak voltage) divided by the square root of two. Thus, a sine-wave power feed with a peak voltage of 141.42 volts (about 283V peak-to-peak) has an RMS voltage of 100 VAC. Our power lines typically deliver between 110 and 125 VAC to the house, so the peak voltage ranges from about 156 to 177 volts. I took the lower value as the example.

Half-wave rectification, if done in a particular fashion (a voltage-doubler arrangement), could actually produce a DC level approaching twice the peak, about 311 to 354 volts, but with a horrible amount of ripple. Full-wave bridge rectification will produce about 154 to 175 volts, after allowing a 0.7 volt drop across each of the two conducting diodes.
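The RMS-to-peak conversion is easy to check for yourself, using the sqrt(2) relation for a pure sine wave and the usual nominal line voltages:

```python
import math

# Peak voltage of a sine wave from its RMS value: Vpeak = Vrms * sqrt(2).
def peak_from_rms(v_rms: float) -> float:
    return v_rms * math.sqrt(2.0)

for v_rms in (110.0, 120.0, 125.0):
    print(f"{v_rms:.0f} VAC -> {peak_from_rms(v_rms):.1f} V peak")
# 110 VAC -> 155.6 V peak, 120 VAC -> 169.7 V peak, 125 VAC -> 176.8 V peak
```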
Go put an oscilloscope on your house power or rig up a full wave bridge and measure the output with a voltmeter and you will see. Take great care. House current is quite dangerous.
This is complete crap.
I have been a professional engineer for 35 years, and prior to that I worked in the physics lab for four years designing and maintaining (among other things) high power circuits while I was getting my degree in Physics. I assure you, it is not crap.
voltage drop on each LED would exceed the input voltage if they were all in series only.
If the number of LEDs drops, or the input voltage rises, to a few volts beyond the point where the string would otherwise carry essentially no current at all, the string will not suddenly burn up, trust me. Again, as long as the maximum input voltage does not exceed the specification for maximum voltage across the LEDs, they will be fine with no other consideration. After all, if you think about it for a minute, that is all the limiting resistor does. If you tied them into the 13 kV line feeding the power transformer serving your house, then you would need a limiting resistor. A HUGE one.
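To put a number on "HUGE" (the 90V string drop and 20mA current here are assumptions for illustration only):

```python
# Sketch: limiting resistor needed to run a small LED string straight
# off a 13 kV distribution line. Values are illustrative assumptions.
v_line, v_string, i_led = 13_000.0, 90.0, 0.020
r_limit = (v_line - v_string) / i_led   # required series resistance
p_limit = (v_line - v_string) * i_led   # power the resistor must shed
print(round(r_limit, 1), round(p_limit, 1))  # 645500.0 ohms, 258.2 watts
```

A 645 kilohm resistor dissipating over 250 watts is hardly the little component hiding in a light-string cylinder.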