In a specific configuration (Source -> TV -> eARC on Receiver) I don't think the cable will make a huge difference, provided its bandwidth is fine. If I remember the specifications right, eARC under HDMI 2.1 has a max audio bandwidth of roughly 40 Mbit/s. That's less than 0.1% of the overall 48 Gbit/s link bandwidth, so there will likely be image problems long before audio ones. I was running 4K HDR 120 Hz through an old HDMI cable that came with a Nintendo Wii U and it worked. With an Amazon Basics HDMI 1.4 cable there were image cut-outs, but I never had audio issues.
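A quick back-of-the-envelope check of that ratio in Python, assuming the roughly 40 Mbit/s eARC audio figure and the 48 Gbit/s HDMI 2.1 maximum link rate quoted above:

```python
# What fraction of an HDMI 2.1 link would eARC audio occupy?
audio_bps = 40_000_000        # ~40 Mbit/s eARC audio (approximate figure)
link_bps = 48_000_000_000     # 48 Gbit/s HDMI 2.1 maximum link rate

fraction = audio_bps / link_bps
print(f"Audio uses {fraction:.4%} of the link")  # well under 0.1%
```

So audio is a rounding error on the link; a cable that is marginal will drop video long before it drops audio.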
The HDMI 2.1 standard's 40-48 Gbit/s is all about the picture, i.e. 4K/8K at 60/120 Hz etc.
eARC is something different: it's an enhanced version of the Audio Return Channel (ARC) standard. eARC has much higher bandwidth than ARC, so audio need not be lossily compressed, and accordingly lossless audio can flow down the eARC channel from TV to AVR or soundbar. Thus in theory you can pass Dolby TrueHD with Atmos (lossless audio) from TV to AVR under eARC, rather than just DD+ (lossy compressed audio) under ARC.
Under the eARC standard one can deliver up to 32 channels of audio, including eight-channel 24-bit/192 kHz uncompressed PCM streams, at speeds of up to 38 Mbit/s.
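That headline figure checks out arithmetically; a quick sanity check on the eight-channel 24-bit/192 kHz case:

```python
# Sanity check on the eARC headline figure:
# 8 channels x 24 bits x 192,000 samples/s of uncompressed PCM
channels, bit_depth, sample_rate = 8, 24, 192_000
bitrate = channels * bit_depth * sample_rate  # bits per second
print(f"{bitrate / 1e6:.3f} Mbit/s")  # 36.864 Mbit/s, just under the ~38 Mbit/s quoted
```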
This means those lossless formats on Blu-ray and 4K Blu-ray discs (Dolby TrueHD, DTS-HD Master Audio, and object-based formats such as Dolby Atmos and DTS:X) should all be compatible.
Thus under eARC you can plug your Blu-ray player directly into a TV HDMI port and it will pass all those lossless audio formats down to your AVR (subject, of course, to the TV manufacturer's degree of support for those formats and for the eARC standard; so yes, like everything today it can be a cluster f_ck of note, since standards are not equally or completely complied with from manufacturer to manufacturer).
Re AudioQuest's special eARC cables? Not sure, but the cable does need to support eARC, and I believe that if a cable complies with HDMI 2.1, eARC is also supported by default.