12-02-2007, 09:44 PM | #1 |
RCC Addict Join Date: Aug 2005 Location: VARCOR
Posts: 1,826
Soldering Stations: Wattage vs Temp
So, I am thinking about making my third "true" attempt at battery soldering, so I started doing a little research here. My first couple of attempts were just nasty and cold, and from reading, I need a 40+ W iron.

Now, I thought I had the soldering station thing covered, seeing as I have been using the same Weller station (WTCP2) since I was probably 12, and have soldered hundreds (if not thousands) of joints in my lifetime with this trusty station. I thought I had an ol' school machine that a modern station couldn't even touch, as this thing has done me well for EVERYTHING I needed to solder. But seeing as I am looking to build some batteries, I thought I would revamp my technique and tools. Looking over ol' trusty, it is a 30 W, 0-450° station, which by today's standards is quite small. Heck, I usually only turn this thing up halfway (about 300°) for quick joints on motor terminals and such, and only a third of the way for simple contact points.

So I start looking into a new station, at the newer Weller 40 W and 80 W stations, but from what I can see, both are 900° stations. Obviously, given the same 120 V supply, the 80 W is pulling more amps, but why, if it is producing the same tip temp? Isn't tip temp the MAIN factor here instead of wattage? Why are soldering guns/irons/stations rated in wattage? Seems kinda like vacuum cleaners being rated by amperage, which really means nothing without including motor efficiency. Thanks!
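To put numbers on the "same voltage, more amps" point: at a fixed line voltage, the wattage rating sets both the current draw and, more importantly, the rate the iron can deliver heat. A rough sketch (the 120 V figure is from the post; the rest is just Ohm's-law arithmetic):

```python
# Rough arithmetic behind "same voltage, more amps":
# wattage is joules of heat the element can deliver per second,
# and at a fixed line voltage it also sets the current draw.
V = 120.0  # line voltage (V)

for watts in (30.0, 40.0, 80.0):
    amps = watts / V  # I = P / V
    print(f"{watts:.0f} W station: {amps:.2f} A draw, "
          f"up to {watts:.0f} J of heat delivered per second")
```

So the 80 W station draws roughly 0.67 A versus 0.33 A for the 40 W one, but the number that matters at the joint is that it can replace heat in the tip twice as fast.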
12-03-2007, 05:07 AM | #2 |
Rock Crawler Join Date: Dec 2005 Location: West Omaha
Posts: 581
It is more a matter of load. Both stations idle at the same tip temperature, but when you put the 80 W iron on a big joint, the heating element can pump heat back into the tip twice as fast, so the tip temperature doesn't sag as much. It won't lose heat to the object you are soldering faster than it can recover. Wattage is really a measure of how much heat the iron can deliver to the work while holding its set temperature, which is exactly what you need for heavy battery tabs.
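That recovery effect can be sketched with a toy lumped-thermal model. All the constants below (tip thermal mass, conductance into the work, temperatures) are made-up illustrative values, not Weller specs; the point is only the trend:

```python
def tip_temp_during_joint(watts, seconds=3.0, dt=0.01):
    """Toy model: tip temperature while held on a cold battery tab.

    The tip dumps heat into the work; the element replaces heat,
    but no faster than the station's rated wattage.
    All constants are assumed illustrative values.
    """
    tip = 700.0    # tip setpoint (F), assumed
    work = 70.0    # cold battery tab (F), assumed
    c_tip = 4.0    # tip thermal mass (J/F), assumed
    k = 0.2        # conductance into the work (W/F), assumed
    for _ in range(int(seconds / dt)):
        q_out = k * (tip - work)   # heat flowing into the joint
        q_in = min(watts, q_out)   # element replaces heat, capped at rating
        tip += (q_in - q_out) * dt / c_tip
    return tip
```

With these assumed numbers, the 80 W tip ends the joint warmer than the 40 W tip even though both start at the same setpoint. That sag on heavy joints is exactly what produces the "nasty and cold" results the original poster describes.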