It is completely premature to claim that the radioactivity will be "several orders of magnitude less".
The neutron flux produced by a fusion reactor will be much higher than in any fission reactor, and it must be absorbed in a shield to produce heat, which will be the useful output of the fusion reactor.
Choosing an appropriate material for the shield will minimize the quantity of radioactive material that is created per unit of output energy, but it is pretty certain that the radioactivity will not be "several orders of magnitude less".
The best that can be hoped for is a shield material that produces only very small quantities of long-lived radioactive isotopes, so that after storage for a modest number of years the radioactivity might decay to "several orders of magnitude less".
Nevertheless this remains to be demonstrated.
For example, any piece of steel near a fusion reactor would accumulate copious amounts of cobalt-60, but that would decay to negligible radioactivity after a few hundred years.
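A quick decay calculation shows "a few hundred years" is actually conservative for cobalt-60 on its own: with a 5.27-year half-life, the activity falls by roughly a factor of a million within a century. (This is a sketch of pure Co-60 decay only; real activated steel contains a mix of isotopes.)

```python
# Back-of-envelope check on cobalt-60 decay.
# Co-60 half-life is 5.27 years; the fraction of activity remaining
# after t years is 0.5 ** (t / 5.27).

CO60_HALF_LIFE_YR = 5.27

def fraction_remaining(t_years: float, half_life_yr: float = CO60_HALF_LIFE_YR) -> float:
    """Fraction of the original activity left after t_years of decay."""
    return 0.5 ** (t_years / half_life_yr)

for t in (10, 50, 100, 200):
    print(f"after {t:3d} years: {fraction_remaining(t):.2e} of initial activity")
```

After 100 years the remaining fraction is about 2e-6; after 200 years it is below 1e-11, i.e. gone for all practical purposes.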
Moreover, to ensure the predicted low residual radioactivity, any shield material needs to be free of impurities, which even in very small quantities could produce dangerous radioactive isotopes.
The requirement for advanced purification will greatly increase the cost of structural materials for fusion reactors. However, this is not a new problem: similar requirements are imposed on the structural materials for fission reactors, and fusion reactors will not be any better from this point of view.
DT fusion reactors can produce much less radioactivity, particularly long-lived radioactivity, than fission reactors, but there are some caveats.
First, the radioactivity is spread through a much larger volume of material. The cost of dealing with it will have a component related to the volume rather than the total radioactivity. It's not clear that dealing with fusion's waste problem will be cheaper than dealing with fission's.
Second, getting low induced radioactivity, and particularly low production of radioisotopes with long half-lives, may require expensively low concentrations of impurities in the reactor materials. For example, the RAFM steel Eurofer 97, a top candidate for DT reactor construction, contains a small amount of nitrogen. Even this trace caused problems: 14C produced from the nitrogen pushed the steel over a regulatory limit, requiring more expensive disposal.
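To see why trace nitrogen matters: the 14N(n,p)14C reaction turns nitrogen into 14C, and 14C's specific activity is high enough that a vanishingly small mass fraction of it crosses typical clearance thresholds. The half-life and molar mass below are standard nuclear data; the 1 Bq/g clearance level is my assumption based on IAEA guidance, not the specific limit Eurofer 97 ran into.

```python
# Rough illustration of the 14C problem from trace nitrogen in steel.
# 14C: half-life 5730 years, molar mass 14 g/mol.

AVOGADRO = 6.022e23
SECONDS_PER_YEAR = 3.156e7
C14_HALF_LIFE_S = 5730 * SECONDS_PER_YEAR
C14_MOLAR_MASS = 14.0   # g/mol
LN2 = 0.6931

# Specific activity of pure 14C (decays per second per gram)
decay_const = LN2 / C14_HALF_LIFE_S
specific_activity = decay_const * AVOGADRO / C14_MOLAR_MASS
print(f"14C specific activity: {specific_activity:.2e} Bq/g")

# Mass fraction of 14C that reaches an assumed 1 Bq/g clearance level
clearance_bq_per_g = 1.0   # assumption, order of IAEA clearance guidance
limit_fraction = clearance_bq_per_g / specific_activity
print(f"14C mass fraction at clearance: {limit_fraction:.1e}")
```

The answer is on the order of 6e-12, i.e. converting even a hundred-thousandth of a ppm-level nitrogen impurity into 14C is enough to matter, which is why impurity specs get so tight.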
(I also have seen a claim from Abdou that the Eurofer 97 for DEMO would cost $3B, just for the raw steel. I'm not clear where this estimate comes from but it could be due to the need to expensively purify the steel of impurities to avoid their activation.)
A thousand tons of molten radioactive lithium, exposed to air, would make a pretty satisfying boom.
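For a sense of scale, here is an upper bound on the "boom": the chemical energy released if that lithium were fully oxidized, using an approximate heat of combustion of ~43 MJ/kg for lithium. This is total heat under a worst-case full-burn assumption, not an actual explosive yield.

```python
# Upper-bound energy estimate for a thousand tonnes of lithium burning in air.
# ~43 MJ/kg is the approximate heat of combustion of lithium (4 Li + O2 -> 2 Li2O).

LI_COMBUSTION_MJ_PER_KG = 43.0   # approximate, assumed value
LITHIUM_MASS_KG = 1.0e6          # a thousand metric tonnes
TNT_J_PER_TON = 4.184e9

energy_j = LITHIUM_MASS_KG * LI_COMBUSTION_MJ_PER_KG * 1e6
print(f"total heat: {energy_j:.1e} J ~ {energy_j / TNT_J_PER_TON:.0f} tons of TNT")
```

Full oxidation works out to roughly 4e13 J, on the order of ten kilotons of TNT equivalent, released as heat over the duration of the fire rather than as a detonation.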
The tritium being bred in the lithium had better amount to more than micrograms, because that will be the fuel. Separating the day's few hundred grams of tritium from the thousand tons of molten radioactive lithium coursing through miles of pipe is an exercise not yet tackled by fusion promoters.
> very small quantities that would be in a fusion reactor
Fusion reactors would involve much more tritium than a fission reactor. In the latter, especially LWRs, tritium comes from rare ternary fission events. In a DT fusion reactor, T is a primary fuel. A 1 GWe fusion plant will burn about 150 kg of tritium a year. To illustrate how much that is: that quantity of tritium would be enough to push more than a century's worth of the average flow of the Mississippi River above the legal limit for drinking water.
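Those numbers can be checked on the back of an envelope. Assumptions: ~33% thermal efficiency (so ~3 GW of fusion power for 1 GWe), 17.6 MeV per D-T reaction, tritium specific activity ~9,650 Ci/g, the EPA drinking-water limit of 20,000 pCi/L, and a Mississippi mean flow of ~16,800 m³/s. All round figures.

```python
# Back-of-envelope check of the tritium burn rate and the dilution claim.

MEV_TO_J = 1.602e-13
AVOGADRO = 6.022e23
SECONDS_PER_YEAR = 3.156e7

# Annual tritium burn for a 1 GWe (~3 GW thermal) D-T plant
fusion_power_w = 3.0e9
reactions_per_s = fusion_power_w / (17.6 * MEV_TO_J)      # D-T reactions per second
t_burn_kg_per_yr = reactions_per_s * 3.0 / AVOGADRO * SECONDS_PER_YEAR / 1000
print(f"tritium burned: ~{t_burn_kg_per_yr:.0f} kg/yr")

# Water volume that 150 kg of tritium would push over the drinking-water limit
T_SPECIFIC_ACTIVITY_BQ_PER_G = 3.57e14   # ~9,650 Ci/g
EPA_LIMIT_BQ_PER_L = 740.0               # 20,000 pCi/L
MISSISSIPPI_FLOW_L_PER_S = 1.68e7        # ~16,800 m^3/s average flow

activity_bq = 150e3 * T_SPECIFIC_ACTIVITY_BQ_PER_G
water_l = activity_bq / EPA_LIMIT_BQ_PER_L
years_of_flow = water_l / MISSISSIPPI_FLOW_L_PER_S / SECONDS_PER_YEAR
print(f"water at the limit: {water_l:.1e} L ~ {years_of_flow:.0f} years of Mississippi flow")
```

The burn rate comes out around 170 kg/yr, consistent with the ~150 kg figure, and one year's worth of that tritium could push roughly 7e16 liters of water over the limit, i.e. well over a century of Mississippi flow.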
These will all be guarded with pseudo-military guns, guards, and gates, just like fission plants, and monitored by the IAEA.
As for radiological safety, see the point about tritium above. And google "tritium leak nuclear plant" for good measure.
You're right that there is less radioactivity in fusion plants, but maybe not enough less to really matter.