Bringing out games that may require a GPU upgrade to hit a desired resolution, detail level, or performance is one thing... but being pretty much forced to upgrade just to play games built on what will arguably be one of the most popular engines of the next gen is a massively stupid idea.
Going by the comic-like document that Dirtbox linked a few pages back, the 2012 top-of-the-line Kepler card ran the UE4 global illumination test scenes (which render the scene into sparse voxel octrees) at 37 fps at 720p and 16 fps at 1080p.
So with a bit of optimisation or scaling back, current systems will still run UE4 and get the benefits of real-time GI. But for gamers to get high frame rates and resolutions, it's either going to need big changes to that technology in UE4, or hardware that doesn't exist yet. Then again, giving people a reason to throw away old hardware and buy brand new shit has always been part of PC gaming.
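For anyone wondering what "sparse voxel octree" actually means in that GI demo: the scene gets voxelised into an octree that only allocates nodes where geometry exists, so empty space costs nothing to store or skip during lighting lookups. Here's a toy sketch of the data structure in Python — purely illustrative, nothing like Epic's actual GPU implementation, and the `insert`/`query` helpers are my own naming:

```python
# Toy sparse voxel octree: nodes are allocated lazily, so empty space
# costs nothing -- the sparsity that SVO-based GI techniques exploit.
# Illustrative sketch only, not how UE4 implements SVOGI on the GPU.

class SVONode:
    __slots__ = ("children", "value")
    def __init__(self):
        self.children = {}   # octant index (0-7) -> SVONode, created on demand
        self.value = None    # leaf payload, e.g. averaged albedo/radiance

def _octant(x, y, z):
    """Pick the child octant for a point in [0,1)^3 and remap the point."""
    idx = 0
    if x >= 0.5: idx |= 1; x -= 0.5
    if y >= 0.5: idx |= 2; y -= 0.5
    if z >= 0.5: idx |= 4; z -= 0.5
    return idx, x * 2, y * 2, z * 2

def insert(root, x, y, z, value, depth=6):
    """Store a value in the voxel containing (x, y, z) at the given depth."""
    node = root
    for _ in range(depth):
        idx, x, y, z = _octant(x, y, z)
        node = node.children.setdefault(idx, SVONode())
    node.value = value

def query(root, x, y, z, depth=6):
    """Return the voxel's value, or None as soon as we hit empty space."""
    node = root
    for _ in range(depth):
        idx, x, y, z = _octant(x, y, z)
        node = node.children.get(idx)
        if node is None:
            return None   # no node was ever allocated here -> empty
    return node.value

root = SVONode()
insert(root, 0.3, 0.7, 0.1, "red")
print(query(root, 0.3, 0.7, 0.1))  # -> red (the voxel we filled)
print(query(root, 0.9, 0.9, 0.9))  # -> None (empty space, zero storage)
```

At depth 6 that's effectively a 64^3 grid, but only the branch holding the one filled voxel ever exists in memory; the real technique cone-traces through the tree's coarser levels to approximate bounced light, which is where the GPU cost in those benchmark numbers comes from.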