ESP32-S3 ADC reading issue
Posted: Thu Sep 01, 2022 4:18 pm
We are having trouble getting correct readings from our custom board with an ESP32-S3-WROOM-1 module. In our design, ADC1_CHANNEL_6 (GPIO_NUM_7) is connected between two resistors forming a voltage divider. The expected voltage range at the divider input is 1.5 V to 2.7 V, which is our battery power input.
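For reference, the ×2 in our read code implies a 1:1 divider (two equal resistors), so the ADC pin should sit at half the battery voltage, comfortably inside what ADC_ATTEN_DB_11 can measure. A quick sanity-check sketch of the divider math (the 100 kΩ values are just an assumption for illustration, not our actual parts):

Code:

#include <stdio.h>

int main(void)
{
    const float r_top = 100e3f, r_bottom = 100e3f; /* assumed values, for illustration */
    for (int mv = 1500; mv <= 2700; mv += 600) {
        float vbat = mv / 1000.0f;
        /* standard divider formula: Vadc = Vbat * Rbottom / (Rtop + Rbottom) */
        printf("Vbat = %.2f V -> ADC pin = %.2f V\n",
               vbat, vbat * r_bottom / (r_top + r_bottom));
    }
    return 0;
}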
During development we connected a bench power supply in place of the battery, in series with a voltmeter. We are using this code to acquire the voltage reading:
Code:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <stdint.h>
#include <inttypes.h>
#include "esp_system.h"
#include "esp_err.h"
#include "esp_log.h"
#include "esp_rom_sys.h"
#include "freertos/FreeRTOS.h"
#include "freertos/task.h"
#include "driver/gpio.h"
#include "driver/adc.h"
#include "esp_adc_cal.h"

static const char *TAG = "batadc";

static esp_adc_cal_characteristics_t *adc_chars;
static const adc1_channel_t channel = ADC1_CHANNEL_6; /* GPIO7 on the ESP32-S3 */
static const adc_bits_width_t width = ADC_WIDTH_BIT_12;
static const adc_atten_t atten = ADC_ATTEN_DB_11;
static uint8_t adc_configured = 0;

void adc_init(void)
{
    /* gpio config: plain input, no pulls, so the divider is not loaded */
    gpio_config_t io_conf = {
        .intr_type = GPIO_INTR_DISABLE,
        .mode = GPIO_MODE_INPUT,
        .pin_bit_mask = 1ULL << GPIO_NUM_7,
        .pull_down_en = GPIO_PULLDOWN_DISABLE,
        .pull_up_en = GPIO_PULLUP_DISABLE,
    };
    ESP_ERROR_CHECK(gpio_config(&io_conf));

    /* config attenuation */
    adc1_config_channel_atten(channel, atten);
    /* config bit width */
    adc1_config_width(width);

    adc_chars = calloc(1, sizeof(esp_adc_cal_characteristics_t));
    esp_adc_cal_value_t val_type = esp_adc_cal_characterize(ADC_UNIT_1,
                                                            atten,
                                                            width,
                                                            1100, /* default vref, used only if no eFuse calibration */
                                                            adc_chars);
    ESP_LOGI(TAG, "ADC calibration scheme: %d", val_type);
    adc_configured = 1;
}

int32_t adc_read(void)
{
#define BATADC_SAMPLE_SZ 32
    /* multisampling */
    int32_t adc_reading = 0;
    for (uint8_t i = 0; i < BATADC_SAMPLE_SZ; i++) {
        adc_reading += adc1_get_raw(channel);
        esp_rom_delay_us(20);
    }
    adc_reading /= BATADC_SAMPLE_SZ;

    /* convert the averaged raw value to mV, then undo the 1:2 divider */
    int32_t voltage = (int32_t)esp_adc_cal_raw_to_voltage(adc_reading, adc_chars) * 2;
    return voltage;
}

void app_main(void)
{
    adc_init();
    for (;;) {
        ESP_LOGI(TAG, "VBat: %" PRId32 " mV", adc_read());
        vTaskDelay(pdMS_TO_TICKS(1000));
    }
}
This gives stable readings, in the sense that all samples are within 10 mV of each other. But they are wrong, sometimes by up to 150 mV! The offset is different each time we power cycle the system.

BUT when we reset the system by pulling RST to GND, the readings suddenly become very accurate (±4 mV) and the offset disappears!
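One thing we plan to try (sketch only, not yet run on our board) is logging the reset reason at boot, to confirm the offset really correlates with power-on versus RST-pin resets; esp_reset_reason() is the stock ESP-IDF call for this:

Code:

#include "esp_log.h"
#include "esp_system.h"

static const char *TAG = "boot";

void log_boot_reason(void)
{
    /* see esp_reset_reason_t in esp_system.h for the possible values */
    esp_reset_reason_t reason = esp_reset_reason();
    ESP_LOGI(TAG, "reset reason: %d", reason);
}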
We did not burn any eFuse or set Two-Point calibration values; we are using the module as it shipped.
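In case it matters, this is how we understand one can check whether factory calibration data is present in eFuse with the legacy esp_adc_cal API (a sketch; our understanding is that on the S3 the relevant scheme is ESP_ADC_CAL_VAL_EFUSE_TP_FIT):

Code:

#include "esp_adc_cal.h"
#include "esp_log.h"

static const char *TAG = "adc_cal";

void check_adc_cal(void)
{
    /* ESP_OK means factory calibration values are burned in eFuse and
       esp_adc_cal_characterize() will use them instead of the default
       vref passed as a fallback. */
    if (esp_adc_cal_check_efuse(ESP_ADC_CAL_VAL_EFUSE_TP_FIT) == ESP_OK) {
        ESP_LOGI(TAG, "eFuse calibration data available");
    } else {
        ESP_LOGW(TAG, "no eFuse calibration data, falling back to default vref");
    }
}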
Are we doing something wrong?