Some configurations of Ruby use RUBY_DEVEL, which depends on PATCH_LEVEL.
But depending on PATCH_LEVEL causes issues that only become apparent in the final release.
Even though we ship previews and RCs, they do not help verify the quality of the code guarded by RUBY_DEVEL.
Therefore, to ensure that CI tests what will actually ship in the final release, we should deprecate RUBY_DEVEL.
I looked into this. If we removed RUBY_DEVEL, ruby_debug_log would never be defined, since it is only defined when RUBY_DEVEL is defined. In other words, removing RUBY_DEVEL is equivalent to removing the entire debug logging feature.
I think a safer and less invasive change is to make RUBY_DEVEL not depend on PATCH_LEVEL. Ruby developers who want debug logging can still pass RUBY_DEVEL=yes manually when configuring, but it will never be enabled automatically. I submitted a pull request for this: https://github.com/ruby/ruby/pull/4993. One change it makes is to set USE_RUBY_DEBUG_LOG to 1 instead of 0 when RUBY_DEVEL is defined and USE_RUBY_DEBUG_LOG isn't. Since RUBY_DEVEL is no longer enabled automatically, the main reason to enable it is to get debug logging, so I think 1 is a better default.
While packaging a new tarball, we ran into this issue again. Building from the tarball produces many warnings like these:
./vm_debug.h:34:5: warning: "RUBY_DEVEL" is not defined, evaluates to 0 [-Wundef]
   34 | #if RUBY_DEVEL
      |     ^~~~~~~~~~
./vm_debug.h:100:5: warning: "USE_RUBY_DEBUG_LOG" is not defined, evaluates to 0 [-Wundef]
  100 | #if USE_RUBY_DEBUG_LOG
      |     ^~~~~~~~~~~~~~~~~~
In file included from ./vm.c:40:
./vm_sync.h:7:5: warning: "USE_RUBY_DEBUG_LOG" is not defined, evaluates to 0 [-Wundef]
    7 | #if USE_RUBY_DEBUG_LOG
      |     ^~~~~~~~~~~~~~~~~~