I want to measure distance from electromagnetic-wave travel time.

The time dependence of a charging capacitor's voltage is well known:

\$V_{capacitor} = V_{source}\left(1 - e^{-t/RC}\right)\$

The equation above is usually used to compute the voltage at a given time.

But we can also invert it to get the time from a given voltage.

Theoretically, if we know the source voltage, the loop resistance, and the capacitance, then measuring the capacitor voltage tells us the elapsed time.
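The inversion can be sketched numerically; this is a minimal round-trip check assuming ideal components (the function name and values are illustrative, not part of any real API):

```python
import math

def elapsed_time(v_cap, v_source, r, c):
    """Invert the RC charging curve: t = -RC * ln(1 - Vc/Vs)."""
    return -r * c * math.log(1 - v_cap / v_source)

# Round-trip check with Vs = 5 V, R = 1 ohm, C = 1 nF (RC = 1 ns):
rc = 1.0 * 1e-9
t = 0.5e-9                            # 0.5 ns into the charge
v = 5.0 * (1 - math.exp(-t / rc))     # forward charging equation
print(elapsed_time(v, 5.0, 1.0, 1e-9))  # recovers ~0.5e-9 s
```

Mathematically the inversion is exact; the practical question is how measurement noise on \$V_{capacitor}\$ translates into time error.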

My question is: how high can the precision of this measurement be? Can the accuracy be on the order of picoseconds?


If we have a loop with a 5 V source, 1 Ω resistance, and a 1 nF capacitor (so RC = 1 ns), then at the start of the charge the capacitor voltage increases by about 15 mV every 3 ps, which seems easy to measure; note, though, that the step shrinks as the capacitor approaches the source voltage, so near 3RC the same 3 ps interval produces a much smaller change.

Then we switch off the source voltage (or even leave it on) and asynchronously sample the capacitor voltage with a connected voltmeter chip, i.e. an analog-to-digital converter (ADC), and pass the data on to a microcontroller.

Should it work?