I’m torn between using 3.3 and 5.0 volt components on my microcontroller projects. Many people seem to be moving to 3.3V, but components don’t seem to be as readily available as 5.0V components. What are the advantages and disadvantages of each?
Why switch when you can easily “translate” or convert 3.3V logic signals to 5V logic signals, and vice versa? SparkFun Electronics sells small boards with level-translation devices ready to go. I prefer Texas Instruments’ SN74LVC4245 in a 24-pin small-outline (SOIC) package because it can translate eight signals independently of one another.
You’ll need two of these ICs: one for 3.3V-to-5V translation and another for 5V-to-3.3V. You can mount these SMT ICs on Schmartboard SOIC-to-DIP adapters. Other suppliers offer similar products, but I find the Schmartboard adapters easiest to solder.
In my view, that is not really an up-front issue. If I want to build a logic circuit, I start with an idea of what I want it to do. Then I start looking for the required parts, making a list of which parts would work along with basic specs such as operating voltage, cost, footprint, and availability.
Sometimes most of the required parts are only available in 3.3V, and the decision is made for me. Sometimes they are available in both 3.3V and 5V, but the 3.3V version comes in a difficult package I can’t solder. And sometimes I can’t find all the required parts at the same voltage, so I end up adding a logic-level shifter, usually simple to build with two transistors and a couple of resistors.
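A side note on such transistor shifters: they are typically open-collector stages with a pull-up resistor on the higher-voltage side, so the pull-up value trades switching speed against current draw. As a rough illustration (the component values below are hypothetical, not from any particular design), you can estimate how long the pull-up takes to charge the bus capacitance up to the receiver’s input-high threshold:

```python
import math

def rise_time_to_threshold(r_pullup, c_bus, v_dd, v_th):
    """Time for an open-collector output, pulled up through r_pullup,
    to charge c_bus from 0 V to the receiver threshold v_th (RC model)."""
    return r_pullup * c_bus * math.log(v_dd / (v_dd - v_th))

# Hypothetical example values:
R = 10e3          # 10 kOhm pull-up resistor
C = 50e-12        # ~50 pF of bus/trace capacitance
VDD = 5.0         # supply on the 5 V side
VIH = 0.7 * VDD   # typical CMOS input-high threshold (0.7 * Vdd)

t = rise_time_to_threshold(R, C, VDD, VIH)
print(f"rise to VIH: {t * 1e9:.0f} ns")
```

With these numbers the rising edge needs roughly 600 ns to reach the threshold, which is fine for buttons and slow buses but marginal for fast signals; a smaller pull-up speeds up the edge at the cost of more current when the transistor pulls the line low.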
The point I’m trying to make is that the choice between 3.3V and 5V is usually dictated by the design, not a starting point.