What I'm really trying to figure out is how to make a simple circuit that sends a pulse between 1 ms and 2 ms from a thumb-stick control, with the 0 (centre) position giving 1.5 ms, 2 ms meaning full forward, and 1 ms meaning full reverse.
The simplest approach is a small microcontroller, of course: the potentiometer can hook directly to an ADC input, and a single digital output is all it takes to drive a servo. A PIC10F220 could be glued to the pot itself and do all the work.
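As a sketch of what that microcontroller firmware would do (assuming an 8-bit ADC reading 0..255, as on the PIC10F220; the function name and constants are illustrative, not from any datasheet):

```c
#include <stdint.h>

/* Map an 8-bit ADC reading of the pot (0..255) to a servo pulse
   width in microseconds: 0 -> 1000 us, ~128 -> ~1500 us, 255 -> 2000 us.
   On the real part you would then bit-bang a pulse of this length
   on the digital output roughly every 20 ms. */
static uint16_t adc_to_pulse_us(uint8_t adc)
{
    return 1000u + (uint16_t)(((uint32_t)adc * 1000u) / 255u);
}
```

The 32-bit intermediate keeps the multiply from overflowing on an 8-bit part; the division by 255 spreads the full stick travel over the full 1000 µs range.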
If long cables are used to carry the signal to the servo, it might be a good idea to buffer it with a transistor to bring the impedance down and keep noise pickup at bay.
[...] a potentiometer (I assume this is what a thumb-stick would use).
Most cheap joysticks use a potentiometer, and I guess that's what you're talking about(?)
And calculate the potentiometer value needed for a change of resistance that gives exactly a pulse from 1 ms to 2 ms. I hope this helps clarify the help I'm looking for.
There's no such thing as "exactly" in the real world. You have to specify any measurement with its allowed tolerances.
Like from 1.000 ms ±5 µs to 2.000 ms ±5 µs, or whatever you need.
And even though the standard says 1000 µs to 2000 µs, some servos go as low as 300 µs and some as high as 2800 µs.
The tightest tolerances can be reached with a crystal (x-tal) controlled microcontroller.
Fairly good results can be had with a microcontroller running on its internal oscillator.
Usable results, if you can get over the "exactly" part, can be achieved with a 555.
I'd go with option B, but I have no idea of your skill level.
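For the 555 option, the sizing math is simple: a 555 in monostable mode gives a pulse of t = 1.1·R·C, so you solve for R. A small sketch (the 100 nF capacitor is just an example value, not a recommendation):

```c
/* 555 monostable: pulse width t = 1.1 * R * C seconds.
   Solve for the resistance that gives a desired pulse width. */
static double r_for_pulse_ohms(double t_seconds, double c_farads)
{
    return t_seconds / (1.1 * c_farads);
}
/* With C = 100 nF: 1 ms needs ~9.1 kOhm and 2 ms needs ~18.2 kOhm,
   so a ~9.1 kOhm fixed resistor in series with a ~9.1 kOhm pot
   would sweep roughly the full 1 ms..2 ms range. */
```

Note that capacitor tolerance and the 555's own variation mean the endpoints will land only approximately where calculated, which is exactly the "exactly" caveat above.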
Some things to consider:
Most joysticks only use part of their potentiometer track, so the actual min/max values should be measured.
A microcontroller program can be made with automatic calibration to the potentiometer min/max values, as well as to the servo min/max positions.
In a PDM signal for a servo, the period is not very critical, but the pulse duration is: deviations of around 5 µs and up will cause visible jitter.
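The calibration idea above can be sketched as a mapping from measured endpoints rather than the theoretical full scale (all names and example values here are made up for illustration):

```c
#include <stdint.h>

/* Map a raw pot reading into a servo pulse width using measured
   calibration endpoints. pot_min/pot_max are the ADC values seen at
   the stick's mechanical extremes; us_min/us_max are the servo's
   usable pulse range in microseconds. */
static uint16_t calibrated_pulse_us(uint16_t adc,
                                    uint16_t pot_min, uint16_t pot_max,
                                    uint16_t us_min, uint16_t us_max)
{
    if (adc < pot_min) adc = pot_min;   /* clamp to measured range */
    if (adc > pot_max) adc = pot_max;
    uint32_t span = pot_max - pot_min;
    return us_min + (uint16_t)(((uint32_t)(adc - pot_min) *
                                (us_max - us_min)) / span);
}
```

In an auto-calibrating version, pot_min and pot_max would simply be the smallest and largest ADC readings recorded while the user wiggles the stick to its limits at power-up.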