The technology to derive substantial electrical current from sunlight has been around since the mid-1950s, when Daryl Chapin, Calvin Fuller, and Gerald Pearson at Bell Labs created the first solar cell capable of generating enough power from the sun to run everyday electrical equipment. Their silicon solar cell was 6% efficient; they later raised the efficiency to 11%.
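Those efficiency figures have a concrete meaning: a cell's electrical output is its efficiency times the solar power striking it. A minimal sketch in Python, assuming the standard test-condition irradiance of roughly 1,000 watts per square meter (a figure not given in the text, used here only for illustration):

```python
IRRADIANCE_W_PER_M2 = 1000  # assumed standard test-condition sunlight intensity

def cell_output_watts(area_m2: float, efficiency: float) -> float:
    """Electrical power from a cell: incident solar power times efficiency."""
    return IRRADIANCE_W_PER_M2 * area_m2 * efficiency

# A hypothetical 0.1 m^2 cell at the original 6% vs. the improved 11% efficiency
print(cell_output_watts(0.1, 0.06))  # roughly 6 W
print(cell_output_watts(0.1, 0.11))  # roughly 11 W
```

Under these assumptions, the jump from 6% to 11% efficiency nearly doubles the power available from the same cell area.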
Anyone familiar with harnessing sunlight for electrical energy will recall from junior high school science class that a photovoltaic solar cell produces electricity only while the sun is shining directly on it.
Since solar cells produce direct (non-alternating) current, it stands to reason that two very costly obstacles stand in the way of practical solar power: 1) converting the direct current (DC) to alternating current (AC) so that it can be used in the common household, and 2) practically storing the energy for use after the sun has set or gone behind the clouds.
By the time solar technology had matured and become less expensive to produce, our nation's infrastructure had already been established and built around the standard of AC at 110 volts and 15 amperes. A major expense in using solar cells is therefore the need for costly power inverters to convert their output from DC to AC.
With help from Exxon Corporation in 1970, Dr. Elliot Berman designed a significantly less costly solar cell. His design decreased the price of solar-generated power from $100 per watt to $20 per watt. Although still costly, this was a giant leap toward the feasibility of practical solar power.
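The scale of that price drop is easy to check with simple arithmetic: total array cost is just wattage times price per watt. A quick sketch, using an illustrative 100-watt array (a size chosen here for the example, not taken from the text):

```python
def array_cost(watts: float, price_per_watt: float) -> float:
    """Total cost of a solar array at a given price per watt."""
    return watts * price_per_watt

# Cost of a 100 W array before and after Berman's design,
# using the $100/W and $20/W figures from the text
cost_before = array_cost(100, 100)  # $10,000
cost_after = array_cost(100, 20)    # $2,000
print(cost_before, cost_after)
```

A fivefold reduction, yet even $2,000 for 100 watts explains why solar power remained out of reach for most households at the time.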