Regarding Krylov, this is a good idea for a low-level algorithm we should supply with ITensor. However, it is one of many things we should or would like to add (tensor slicing, other algorithms, etc.), so it may take some time for us to get to it.
But in the meantime, it is something that should be quite possible to implement yourself. Good examples of similar algorithms written in ITensor can be found in this file:
In that file you will find the Arnoldi, GMRES, and Davidson algorithms.
These all follow similar principles: one multiplies a large linear operator onto an initial 'state' or vector, then orthogonalizes the resulting vector against all vectors produced earlier in the algorithm, building up a Krylov subspace. One then solves the desired problem (an eigenvalue problem, an Ax=b solve, etc.) within that Krylov space.
The key ITensor operations in each of these algorithms are just contracting ITensors together (used both to multiply the vectors by the large linear operator and to take dot products between vectors) and adding ITensors. Most of the remaining code just manipulates scalars and small matrices to carry out the logic of the algorithm and to check for convergence.
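To make the structure concrete, here is a minimal sketch of the Krylov-basis-building step described above, using plain `std::vector<double>` in place of ITensors (the names `matvec`, `dot`, and `krylovBasis` are my own, not from the ITensor library). In an ITensor version, `matvec` would become a tensor contraction with the large operator and `dot` an ITensor inner product, but the orthogonalization logic would look essentially the same:

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

using Vec = std::vector<double>;
using Mat = std::vector<Vec>;

// Stand-in for applying the large linear operator (an ITensor contraction).
Vec matvec(Mat const& A, Vec const& v)
    {
    Vec w(A.size(), 0.0);
    for(std::size_t i = 0; i < A.size(); ++i)
        for(std::size_t j = 0; j < v.size(); ++j)
            w[i] += A[i][j]*v[j];
    return w;
    }

// Stand-in for an ITensor inner product of two "vectors".
double dot(Vec const& a, Vec const& b)
    {
    double d = 0.0;
    for(std::size_t i = 0; i < a.size(); ++i) d += a[i]*b[i];
    return d;
    }

// Build an orthonormal Krylov basis {q0, q1, ...} from start vector b:
// repeatedly apply A, orthogonalize against all previously produced
// basis vectors (Gram-Schmidt), then normalize.
Mat krylovBasis(Mat const& A, Vec b, int nsteps)
    {
    double nrm = std::sqrt(dot(b, b));
    for(auto& x : b) x /= nrm;
    Mat Q{b};
    for(int k = 0; k < nsteps; ++k)
        {
        Vec w = matvec(A, Q.back());
        for(Vec const& q : Q)
            {
            double ov = dot(q, w);
            for(std::size_t i = 0; i < w.size(); ++i) w[i] -= ov*q[i];
            }
        double wn = std::sqrt(dot(w, w));
        if(wn < 1E-12) break; // invariant subspace reached; stop early
        for(auto& x : w) x /= wn;
        Q.push_back(w);
        }
    return Q;
    }
```

The small matrix of overlaps `dot(Q[i], matvec(A, Q[j]))` is then what one diagonalizes (or solves Ax=b with) in the Krylov space; that part involves only ordinary dense linear algebra on a matrix whose dimension is the number of Krylov vectors.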
Finally, regarding extracting the data of a dense ITensor, you can refer to this code formula page:
Please ask if you have a question about that page.
Unfortunately, things aren't as simple for extracting the storage of block-sparse tensors, i.e. IQTensors. I could give you code similar to the above that extracts the low-level storage of an IQTensor, but you'd still need to know how that storage is organized in order to read out the different blocks properly. Let me know if you definitely decide you need to do that. It could be easier, though, to just convert the IQTensor to a dense ITensor and read that out instead.