
i2c - Dealing with non-standard custom chip timing

Posted: Tue Apr 04, 2023 1:35 pm
by expresspotato
Hello,

So I've got an audio DSP here that relies on somewhat custom timing with regard to i2c. According to the documentation, there are varying delays that need to be inserted when reading from or writing to the chip:

- After the Start command, a delay of 20 us
- Between its internal address low & high bytes, a delay of 24 us
- Before reading the returned data bytes, a delay of 16 us

Other delays, like reading back stored register values, are at minimum in the millisecond range. That's fine, since the command selecting which value to read is sent in one transaction, and the result is read back later with a separate i2c command link.

Since one bit operation takes 10 us @ 100 kHz, there doesn't seem to be an easy way for me to add custom delays (especially sub-bit-time delays in us) to an i2c_cmd_handle_t.

Does anyone have any experience / direction to take with this before I dig into the lower level code for days?


Re: i2c - Dealing with non-standard custom chip timing

Posted: Tue Apr 04, 2023 9:49 pm
by MicroController
Not 100% sure about the I2C driver implementation right now, but try and see if you can split your communication into multiple "transactions": start with a "transaction" that contains only a single START, then do basically one write (or read) transaction per byte, so you get room for enough delays in between.
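Roughly, the split could look like this with the legacy ESP-IDF driver. This is only a sketch: the device address 0x34 is made up, the delays are the ones quoted from the datasheet above, and whether i2c_master_cmd_begin() will execute a command link that doesn't carry its own START is exactly what would need testing on real hardware.

```c
// Sketch: split one logical write into several i2c_cmd_handle_t pieces
// so the CPU can insert chip-specific delays between them.
#include "driver/i2c.h"
#include "esp_rom_sys.h"   // esp_rom_delay_us()

// Helper: execute one command-link piece and free it.
static esp_err_t send_piece(i2c_cmd_handle_t cmd)
{
    esp_err_t err = i2c_master_cmd_begin(I2C_NUM_0, cmd, pdMS_TO_TICKS(100));
    i2c_cmd_link_delete(cmd);
    return err;
}

// Write the DSP's 16-bit internal address with the datasheet delays.
void dsp_write_addr(uint8_t addr_lo, uint8_t addr_hi)
{
    // Piece 1: START + device address (0x34 is hypothetical).
    i2c_cmd_handle_t cmd = i2c_cmd_link_create();
    i2c_master_start(cmd);
    i2c_master_write_byte(cmd, (0x34 << 1) | I2C_MASTER_WRITE, true);
    send_piece(cmd);
    esp_rom_delay_us(20);              // 20 us after START (datasheet)

    // Piece 2: internal address low byte.
    cmd = i2c_cmd_link_create();
    i2c_master_write_byte(cmd, addr_lo, true);
    send_piece(cmd);
    esp_rom_delay_us(24);              // 24 us between address bytes

    // Piece 3: internal address high byte, then STOP.
    cmd = i2c_cmd_link_create();
    i2c_master_write_byte(cmd, addr_hi, true);
    i2c_master_stop(cmd);
    send_piece(cmd);
}
```

Note the delay lands in task context between i2c_master_cmd_begin() calls, so it will be a lower bound rather than an exact gap.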

That chip has one pretty wild "I2C" interface someone cobbled together in software ;-)

Re: i2c - Dealing with non-standard custom chip timing

Posted: Wed Apr 05, 2023 12:07 pm
by expresspotato
Hello @MicroController,

Thanks kindly for the reply and your suggestion. The more I think about it, the more I think it's going to be a real problem, because the chip requires 24 us, which is neither 1 nor 2 bit times (10 us each) but something in between for some reason.

I will try just the start command followed by an arbitrary delay, but this may be hard to achieve accurately since the delay happens back on the CPU in user code rather than in the lower-level i2c code / hardware.

I suspect this could be done by creating an output clock and bit banging, but then I lose all the i2c library functionality...

Re: i2c - Dealing with non-standard custom chip timing

Posted: Sat Apr 15, 2023 3:16 pm
by MicroController
To me it seems that the specs are only giving minimums for the timings...

Re: i2c - Dealing with non-standard custom chip timing

Posted: Sun Apr 16, 2023 2:01 am
by expresspotato
After hours with the logic analyzer, I found the timing doesn't even follow the spec sheet when the commands are sent from the chip's Windows configuration utility. I ended up using the following library and adding the needed delays with esp_rom_delay_us:

https://github.com/tuupola/esp_software_i2c