I'm not sure if it works that way.
Radiocarbon dating works by measuring the amount of carbon-14 in a sample. The premise is that a living organism constantly takes in carbon (both carbon-12 and carbon-14), so carbon-14 enters its system. Once the organism dies, it stops taking in carbon-14, and the carbon-14 already in its body begins to decay. Because carbon-14 has a relatively long half-life (5,730 years), the amount remaining can be used to date the remains.
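Concretely, the remaining abundance follows the usual exponential decay law (here N0 is the abundance at death and T = 5730 years is the half-life):

N(t) = N0 · (1/2)^(t / T)

so after one half-life, exactly half of the original carbon-14 is left.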
In this case, suppose the ¹⁴C abundance in our control group of animals is 1.0 ppt (parts per trillion) at the time they die, and their fossils are preserved. Now suppose another group of animals, living at the same time but in a different place, spent generations eating grass that had ash from a nearby forest raining down on it. With that more carbon-rich diet, suppose their abundance is 1.1 ppt ¹⁴C. After one half-life, the control group is down to 0.5 ppt ¹⁴C, and the other group to 0.55 ppt ¹⁴C.
Suppose this experiment began in the modern day, when the standard ¹⁴C concentration is 1.0 ppt. Then, 5,730 years from now, a radiocarbon dating method that assumes an initial abundance of 1.0 ppt would read the second group's 0.55 ppt as an age of 5730 · log₂(1.0/0.55) ≈ 4,942 years, and would therefore deem the second group to have lived about 788 years after the first.
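The offset is just arithmetic, but a short script makes it explicit. This is a sketch of the scenario above, not any real dating protocol; the function name `apparent_age` and the ppt figures are taken from the example:

```python
import math

HALF_LIFE = 5730.0   # carbon-14 half-life, in years
STANDARD_PPT = 1.0   # assumed initial 14C abundance (parts per trillion)

def apparent_age(measured_ppt, initial_ppt=STANDARD_PPT):
    # Invert the decay law: measured = initial * (1/2) ** (t / HALF_LIFE)
    return HALF_LIFE * math.log2(initial_ppt / measured_ppt)

# After one half-life, the control group (1.0 ppt) is at 0.5 ppt
# and the ash-enriched group (1.1 ppt) is at 0.55 ppt.
control_now = 1.0 / 2
enriched_now = 1.1 / 2

print(round(apparent_age(control_now)))   # 5730 -- dated correctly
print(round(apparent_age(enriched_now)))  # 4942 -- appears younger
print(round(apparent_age(control_now) - apparent_age(enriched_now)))  # 788
```

The enriched group starts with more ¹⁴C than the dating method assumes, so it looks like less of it has decayed, and the fossils are dated roughly 788 years too young.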