In your scenario, although I wouldn't at that moment be having the experience of two-way interaction with my environment, I would have had it in the past. And, since I temporarily lack that interaction when I dream, it is clear that one can be conscious without having that interaction right now. But just because consciousness can persist without that interaction at a given moment, it doesn't necessarily follow that consciousness could exist without that interaction ever having occurred.
And that's a fundamental difference between conscious biological life and GPT-3. Conscious biological life experiences a two-way interaction with its environment, in which organism and environment act on each other simultaneously. GPT-3's experience of that is very limited. It has experienced the environment acting on it (training), and it has experienced itself acting on the environment (runtime), but those two experiences are largely siloed off from each other. (It effectively does have some runtime memory in the form of its context window, so to a very limited degree it can dynamically react to the environment, but it can't actually learn anything at runtime: its weights are fixed once training ends.)
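To make that siloing concrete, here is a deliberately toy sketch (nothing like GPT-3's actual architecture, just an illustration of the asymmetry): the environment can modify the model only during a training phase, while at runtime the model can condition on a prompt but never writes anything back to its weights.

```python
# Toy illustration of the training/runtime split described above.
# "Weights" change only during train(); generate() reads a context
# window (runtime memory) but never updates the weights.

class ToyLM:
    def __init__(self):
        self.weights = {}   # shaped by the environment, but only during training
        self.context = []   # runtime "memory": the current prompt window

    def train(self, corpus):
        # Training phase: the environment acts on the model.
        for token in corpus:
            self.weights[token] = self.weights.get(token, 0) + 1

    def generate(self, prompt):
        # Runtime phase: the model acts on the environment.
        # It can react to the prompt (limited dynamic behavior)...
        self.context = list(prompt)
        # ...but nothing here touches self.weights: no runtime learning.
        return max(self.weights, key=self.weights.get) if self.weights else ""

lm = ToyLM()
lm.train(["the", "cat", "the"])
frozen = dict(lm.weights)
lm.generate(["some", "novel", "input"])
assert lm.weights == frozen   # the prompt left the weights untouched
```

The two phases never run at once, which is the point: the organism-like loop, where acting and being acted on happen simultaneously, has no counterpart here.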
Now is that experience, which humans and animals have but which GPT-3 lacks, essential to consciousness? Who really knows. The fact is, we don't really know what consciousness is, or what the conditions for its existence are. Maybe at least some history of that kind of two-way interaction is essential for consciousness, in which case GPT-3 can't have it (but some future successor system might). Maybe not. Nobody really knows.