If you look at a temp gauge troubleshooting guide, the explanation of how these gauges work is fairly simple:
- 12 volts runs to one terminal of the gauge, then through the gauge and through the wire to the sender in the block.
- The sender is just a big variable resistor, and its resistance varies based on temperature.
- When the engine is cold, sender resistance is high and the gauge reads cold.
- When the engine is hot, sender resistance is low and the gauge reads hot.
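If it helps to see that with numbers, here is a rough Python sketch of the circuit. The gauge coil resistance and the sender values below are just illustrative guesses, not GM specs; the point is only that a hot (low resistance) sender lets more current through the gauge than a cold (high resistance) one.

# Rough model of the circuit: 12 V through the gauge coil, then through
# the sender to ground. Values are illustrative guesses, not factory specs.
SUPPLY_VOLTS = 12.0
GAUGE_COIL_OHMS = 60.0   # assumed internal resistance of the gauge

def gauge_current(sender_ohms):
    """Current (amps) through the gauge for a given sender resistance."""
    return SUPPLY_VOLTS / (GAUGE_COIL_OHMS + sender_ohms)

print(gauge_current(1600))  # cold engine, high resistance -> small current, needle low
print(gauge_current(185))   # hot engine, low resistance -> larger current, needle high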
First test - key on, sending unit wire grounded (zero resistance); the gauge should swing to full hot.
Second test - key on, a high resistance (1500 ohms is typical) between the sending unit wire and ground; the gauge should read near zero.
If these basic tests work, then you need the manufacturer's table of resistance vs. gauge reading. For example, this post lists typical values for a GM temp gauge:
http://www.gafiero.org/bbs/index.php?topic=641.0
Based on that table, 185 ohms between the sender wire and ground should read about 210 degrees on the gauge, while 1600 ohms should read about 100 degrees.
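As a sanity check on in-between values, here is a small Python sketch that draws a straight line between the two points quoted above (185 ohms = 210 degrees, 1600 ohms = 100 degrees). Real sender curves are not straight lines, so treat this as ballpark only and use the full table from the linked post if you need accuracy.

# Two points quoted from the GM table: (ohms, degrees F).
# Straight-line interpolation between them -- ballpark only.
LOW = (185.0, 210.0)
HIGH = (1600.0, 100.0)

def approx_temp(sender_ohms):
    """Rough temperature estimate for a sender resistance between 185 and 1600 ohms."""
    frac = (sender_ohms - LOW[0]) / (HIGH[0] - LOW[0])
    return LOW[1] + frac * (HIGH[1] - LOW[1])

print(approx_temp(185))   # 210.0
print(approx_temp(1600))  # 100.0
print(approx_temp(900))   # roughly halfway, about 154 degrees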
You can buy resistors from Radio Shack or any electronic supply place, and just string them together to get the resistance you want. Resistances in series add, so wiring a 1000 ohm resistor in series with a 500 ohm resistor gives you 1500 ohms. Do not put them in parallel: the parallel combination is always lower than the smallest resistor, so the values no longer simply add.
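A quick Python sketch of that arithmetic, purely illustrative: series resistances add, while the parallel combination comes out smaller than either resistor, which is why it will not give you the value you expect.

def series(*ohms):
    """Total resistance of resistors wired end to end."""
    return sum(ohms)

def parallel(*ohms):
    """Total resistance of resistors wired side by side."""
    return 1.0 / sum(1.0 / r for r in ohms)

print(series(1000, 500))    # 1500 ohms -- what you want
print(parallel(1000, 500))  # about 333 ohms -- lower than either resistor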
My plan is to check the calibration on my gauge; however, mine does not have numbers. Since I would like 200 degrees to read a little under the center line, I will calibrate it by popping off the pointer and re-seating it there with 225 ohms of resistance on the sender wire. Then I'll note which tick mark it sits at with 120 ohms (230 degrees) and treat that as the maximum I ever want to see in normal driving.
Bruce