Leaving the 220V/110V step-down transformer out of it for a moment, the plasma uses the same energy whether you run it on 110 V or 220 V. The main difference is that at 110 V the appliance draws twice the current it would from a 220 V supply line. Power P - the product of V (voltage) and I (current) - is the energy an electrical appliance consumes per second, and it remains the same either way: halve the voltage and the current doubles. If anything, 220V/240V distribution should be safer because of the lower current flow - some American standards are technically questionable, but because they are American, they tend to become standards anyway.
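To make that concrete, here's a quick sketch using the set's 635 W full-load rating (the figure from your spec sheet) - same power, twice the current at the lower voltage:

```python
# I = P / V: current drawn by a 635 W load at each line voltage
RATED_POWER_W = 635  # full-load rating of the plasma

for volts in (110, 220):
    amps = RATED_POWER_W / volts
    print(f"{volts} V -> {amps:.1f} A")
# 110 V -> 5.8 A
# 220 V -> 2.9 A
```

The product V x I comes out to 635 W in both rows, which is the whole point.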
However, the voltage transformer (AVR) inserted between the outlet and the plasma adds some power load of its own (its idle draw plus its conversion losses under load). If it's highly efficient, it won't be much of a pain in the wallet. If it's not efficient enough, it will most likely burn out anyway under the heavy load of the 58" plasma. So the answer to your last question is: the AVR's contribution.
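How much that contribution amounts to depends entirely on your unit, but a rough sketch with hypothetical transformer figures (the 90% efficiency and 15 W idle draw below are assumptions - check your AVR's datasheet) looks like this:

```python
# Hypothetical AVR figures -- substitute your unit's datasheet values
IDLE_LOSS_W = 15    # assumed no-load (idle) draw of the transformer
EFFICIENCY = 0.90   # assumed conversion efficiency under load
TV_LOAD_W = 635     # the plasma's full-load rating

# Simplified model: wall draw = load / efficiency, plus idle losses
wall_draw_w = TV_LOAD_W / EFFICIENCY + IDLE_LOSS_W
overhead_w = wall_draw_w - TV_LOAD_W
print(f"Drawn from the wall: {wall_draw_w:.0f} W ({overhead_w:.0f} W overhead)")
```

With those assumed numbers the AVR alone eats roughly 85 W on top of the TV - and it keeps sipping its idle draw even when the set is off, unless you unplug it.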
Your 58" full HD plasma alone is rated at 635 W at full load, but its annual power consumption at 4.5 hours of viewing per day is estimated at 475 kWh, which translates to about 40 kWh per month - not so bad for someone who can afford a 58" plasma in the first place.
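Those two numbers are consistent, by the way - the annual estimate implies the set averages well under its 635 W peak while you're watching:

```python
ANNUAL_KWH = 475     # estimated annual consumption at 4.5 h/day viewing
HOURS_PER_DAY = 4.5

monthly_kwh = ANNUAL_KWH / 12
avg_watts = ANNUAL_KWH * 1000 / (HOURS_PER_DAY * 365)
print(f"~{monthly_kwh:.0f} kWh per month")
print(f"~{avg_watts:.0f} W average while on")
```

The ~290 W average reflects that real-world content rarely drives a plasma at its full-load rating.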