As a few of the experts here have already explained, it doesn't really make sense to compare, because they're really two different worlds.
In the analog world, you're 'battling' signal to noise, distortion, etc ... even the junkiest analog is still a 'pure' continuous signal, just a crummy, noisy one. With crappy digital, you don't really have noise problems in that sense; you just get a badly constructed recreation of the signal.
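Not from the post itself, but here's a toy Python sketch of that distinction: degrade the same sine wave once with added hiss (the analog-style failure) and once with very coarse quantization (the digital-style failure). All the names and parameter values here are made up for illustration.

```python
import math
import random

# Illustrative sketch: two ways to ruin the same signal.
# "Crummy analog": the pure signal plus random hiss -- still continuous-valued.
# "Crappy digital": no added noise at all, just a badly reconstructed
# staircase version of the signal (here, 3-bit quantization).

random.seed(0)
N = 1000
signal = [math.sin(2 * math.pi * 5 * n / N) for n in range(N)]

# Analog-style degradation: add Gaussian noise
noisy = [s + random.gauss(0, 0.05) for s in signal]

# Digital-style degradation: snap every sample to one of a few levels
levels = 2 ** 3  # 8 levels = 3 bits, absurdly low on purpose
half = levels / 2 - 1
quantized = [round(s * half) / half for s in signal]

# The noisy version takes essentially arbitrary values; the quantized
# version can only ever sit on a handful of discrete steps.
print(len(set(noisy)), "distinct noisy values")
print(len(set(quantized)), "distinct quantized values")
```

The point of the toy: the noisy copy is still a continuous signal buried in hiss, while the quantized copy is perfectly quiet but shaped wrong.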
Tape speed, track width, etc. are only relevant in the analog world and have no digital equivalents. Even these things are probably less important than the electronic design of the machine itself (the full audio path). 30 ips is not objectively better than 15 ips; there are pros and cons to both (15 ips has better low end, 30 ips has better high end and lower noise). Track width, maybe, but past a certain point it's really just buying you better signal to noise.
I don't think oxide particles can be compared at all. The oxide is really just the canvas ... the digital equivalent is probably a hard drive. I mean, the tape type does influence the sound, but again it really isn't comparable to anything in the digital realm, because in a sense NOTHING digital actually has a valid sonic 'texture' -- only the analog stages do. In theory, once digitized, it's 'perfect', i.e. it will not change. The signal on a strip of tape will actually change physically over time.
Another reason these comparisons are not relevant is that in analog, these things are not subtle -- they're very obvious. The differences in digital, once you reach a certain point, are very subtle. Lots of people can't tell the difference between standard 16-bit CD and the fancier digital choices people have these days. Ditto basic, cheap converters and super-$$$ ones.
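For a rough sense of why the digital differences get subtle: the theoretical dynamic range of an ideal n-bit converter grows by about 6 dB per bit. This little snippet (my own illustration, not from the post) computes it for 16 and 24 bits -- 16-bit already gives you roughly 96 dB, more than most playback situations ever exercise.

```python
import math

def dynamic_range_db(bits):
    """Theoretical dynamic range of an ideal n-bit quantizer, in dB.

    Uses the standard 20*log10(2^n) relation, about 6.02 dB per bit.
    This ignores dither and real converter imperfections.
    """
    return 20 * math.log10(2 ** bits)

for bits in (16, 24):
    print(f"{bits}-bit: ~{dynamic_range_db(bits):.1f} dB")
```

The extra headroom of 24-bit is real and useful while tracking, but at the listening end the jump from 16 bits upward sits well below what the analog playback chain (and the room) will let you hear.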