As time progressed, each would begin to acquire its slower, modern-day stable half-life, but would they all acquire these stable rates uniformly, in a way that kept them all in synchrony? If they did, you are right: all would give the same ages.
A great book on the flaws of dating methods is Radioisotopes and the Age of the Earth (edited by Larry Vardiman, Andrew Snelling, and Eugene F. Chaffin; Institute for Creation Research, December 2000). Dating methods are based on three unprovable and questionable assumptions:

1) That the rate of decay has been constant throughout time.
2) That the isotope abundances in the specimen being dated have not been altered during its history by the addition or removal of either parent or daughter isotopes.
3) That when the rock first formed it contained a known amount of daughter material. (Radioisotopes and the Age of the Earth, p. v)

We must recognize that past processes may not be occurring at all today, and that some may have occurred at rates and intensities far different from those of similar processes today.
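To see concretely where these assumptions enter, here is a minimal sketch of the conventional age equation, t = ln(1 + D*/P) / λ, where λ is the decay constant derived from the half-life (assumption 1), P is the measured parent, and D* is the daughter attributed to decay, i.e. the measured daughter minus whatever initial daughter is assumed (assumption 3). The half-life below is the accepted modern value for potassium-40; the isotope amounts are made-up illustrative numbers, not real measurements.

```python
import math

def radiometric_age(parent, daughter, half_life, initial_daughter=0.0):
    """Conventional age equation: t = ln(1 + D*/P) / lambda.

    D* (radiogenic daughter) = measured daughter - assumed initial daughter.
    Assumes the decay rate (lambda) has been constant and the system closed.
    """
    decay_constant = math.log(2) / half_life  # assumption 1: constant rate
    radiogenic = daughter - initial_daughter  # assumption 3: known initial daughter
    return math.log(1 + radiogenic / parent) / decay_constant

# Hypothetical measurements (illustrative numbers only):
HALF_LIFE_K40 = 1.25e9  # accepted modern half-life of K-40, in years

# Same measured amounts, two different assumptions about initial daughter:
age_zero_initial = radiometric_age(parent=100.0, daughter=10.0,
                                   half_life=HALF_LIFE_K40)
age_some_initial = radiometric_age(parent=100.0, daughter=10.0,
                                   half_life=HALF_LIFE_K40,
                                   initial_daughter=5.0)
# The identical specimen yields a different computed "age" depending
# solely on how much daughter material is assumed to have been present
# when the rock first formed.
```

The point of the sketch is that the computed age is not a direct measurement: change the assumed initial daughter (or the assumed constancy of λ) and the same laboratory numbers yield a different age.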
Let's say initially every radioactive element was "exploded" into existence from pre-existent elements.