You could try adding the site-packages dir of the other virtualenv to your PYTHONPATH variable. Your mileage may vary, but I think it would work for the majority of the packages.
export PYTHONPATH=<other-env>/lib/python3.6/site-packages:$PYTHONPATH
(or the equivalent variable setting statement for your OS/Shell)
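The same thing can also be done at runtime from inside Python, which is handy for a quick experiment. A minimal sketch; the site-packages path below is a hypothetical placeholder for your other venv:

```python
import site
import sys

# Hypothetical path - replace with the other venv's real site-packages dir:
other_site = "/path/to/other-env/lib/python3.6/site-packages"

# site.addsitedir appends the directory to sys.path and also processes
# any .pth files inside it (unlike a plain sys.path.append).
site.addsitedir(other_site)
```

Note that directories added this way go to the *end* of `sys.path`, so packages already importable in the current venv keep winning.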
Update
Note that the approach in the original answer above will only just work if both venvs use the same Python version and there are no conflicting version requirements among the dependencies. So it might work for a while, but not be reliable long term.
For example, say hypothetically the packages in VENV1 rely on requests version 2.20, while packages in the second venv rely on the current 2.32 version. Which requests library gets loaded will depend on the order of the site-packages directories in the path. But the worst part: things would work nicely until one package in VENV1 uses requests in a way that is incompatible with the 2.32 version, and then break with a cryptic error message. (Again, this is just one example; requests, although likely to be a requirement of packages on both sides, is not especially prone to backwards incompatibilities.)
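The shadowing effect can be demonstrated without installing anything: below, two throwaway directories stand in for the two venvs' site-packages, each holding a same-named module with a different version string (`mylib` and the paths are made up for the demo):

```python
import os
import sys
import tempfile

# Two directories standing in for two venvs' site-packages dirs:
dir_a = tempfile.mkdtemp()
dir_b = tempfile.mkdtemp()

# Same module name, different "versions" - like requests 2.20 vs 2.32:
with open(os.path.join(dir_a, "mylib.py"), "w") as f:
    f.write('__version__ = "2.20"\n')
with open(os.path.join(dir_b, "mylib.py"), "w") as f:
    f.write('__version__ = "2.32"\n')

# Whichever directory comes first in sys.path wins the import:
sys.path.insert(0, dir_b)
sys.path.insert(0, dir_a)

import mylib
print(mylib.__version__)  # prints "2.20" - dir_a shadows dir_b entirely
```

Every import of `mylib` in the whole process now gets the 2.20 copy, including imports made by packages that declared they need 2.32.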
All in all, however, it is a great way to avoid keeping several copies of the same huge Python library, like TensorFlow: merging the site-packages, or using symbolic links so that more than one venv can "see" the TensorFlow installed in a single disk location, is a huge gain, since that install is on the order of a few gigabytes.
Also note that since 2024, tools like Astral's uv can add dependencies to venvs using symbolic links instead of copying all the files. That would make installing TensorFlow in a second venv on the same computer (same Python version, etc...) really fast, and take no extra disk space.
Another approach, if one gets conflicting Python versions or hits package dependency incompatibilities, would be to call everything that needs the different libraries through Python subprocess, or something like Celery making RPCs (Remote Procedure Calls): that'd be harder to set up, but reliable once it's done. With Celery, for example, it is possible to have a number of workers running in a separate venv, even with a different Python version. The project just has to be careful with the import statements done in the workers and in the main process, but it would be super reliable.
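The subprocess variant can be sketched in a few lines: spawn the other venv's interpreter, run a snippet that uses that venv's libraries, and hand results back as JSON. The interpreter path is hypothetical; here `sys.executable` stands in for it so the example is runnable as-is:

```python
import json
import subprocess
import sys

# In real use this would be the other venv's interpreter, e.g.
# "/path/to/other-env/bin/python"; sys.executable stands in here.
other_python = sys.executable

# Code executed with the other venv's libraries; it reports back via JSON
# on stdout, which works across different Python versions.
code = (
    "import json, sys; "
    "print(json.dumps({'major': sys.version_info[0]}))"
)

result = subprocess.run(
    [other_python, "-c", code],
    capture_output=True, text=True, check=True,
)
data = json.loads(result.stdout)
print(data["major"])
```

JSON (or another serialization format) is the key design choice: the two interpreters share no objects, only bytes, which is exactly what makes the setup immune to version conflicts.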