
How to load jobs that finished in an old kernel (pyiron/old_latest, Python 3.10) in a new kernel (pyiron/latest, Python 3.11) #1585

Open · lfzhu-phys opened this issue Oct 14, 2024 · 2 comments
Labels: bug (Something isn't working)

lfzhu-phys commented Oct 14, 2024

I have two sets of data. One set was finished under the old kernel (pyiron/old_latest, Python 3.10) and the other under the new kernel (pyiron/latest, Python 3.11). I need to analyze both sets in one notebook, but each kernel refuses to load the jobs that were finished under the other one. Is there any clever way to solve this without rerunning these jobs? Thanks a lot in advance.

The error message:

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
Cell In[18], line 26
     24 #jobname_step4 = job_name[:-((len(job_name[job_name.rindex('_')+1:]))+1)]
     25 print(jobname_step4)
---> 26 ham_step4 = pr_step4[jobname_step4]
     27 jobname_step4_ref = 'min20g_' + alat_str
     28 ham_step4_ref = pr_step4_ref[jobname_step4_ref]

File /cmmc/ptmp/pyironhb/mambaforge/envs/pyiron_mpie_cmti_2024-10-07/lib/python3.11/site-packages/pyiron_base/project/generic.py:1877, in Project.__getitem__(self, item)
   1873         except ValueError:
   1874             return self._get_item_helper(
   1875                 item=item_lst[0], convert_to_object=True
   1876             ).__getitem__("/".join(item_lst[1:]))
-> 1877 return self._get_item_helper(item=item, convert_to_object=True)

File /cmmc/ptmp/pyironhb/mambaforge/envs/pyiron_mpie_cmti_2024-10-07/lib/python3.11/site-packages/pyiron_base/project/generic.py:1939, in Project._get_item_helper(self, item, convert_to_object)
   1937     if self._inspect_mode or not convert_to_object:
   1938         return self.inspect(item)
-> 1939     return self.load(item)
   1940 if item in self.list_files(extension="h5"):
   1941     file_name = posixpath.join(self.path, "{}.h5".format(item))

File /cmmc/ptmp/pyironhb/mambaforge/envs/pyiron_mpie_cmti_2024-10-07/lib/python3.11/site-packages/pyiron_base/project/jobloader.py:105, in JobLoader.__call__(self, job_specifier, convert_to_object)
    104 def __call__(self, job_specifier, convert_to_object=None) -> GenericJob:
--> 105     return super().__call__(job_specifier, convert_to_object=convert_to_object)

File /cmmc/ptmp/pyironhb/mambaforge/envs/pyiron_mpie_cmti_2024-10-07/lib/python3.11/site-packages/pyiron_base/project/jobloader.py:76, in _JobByAttribute.__call__(self, job_specifier, convert_to_object)
     72     state.logger.warning(
     73         f"Job '{job_specifier}' does not exist and cannot be loaded"
     74     )
     75     return None
---> 76 return self._project.load_from_jobpath(
     77     job_id=job_id,
     78     convert_to_object=(
     79         convert_to_object
     80         if convert_to_object is not None
     81         else self.convert_to_object
     82     ),
     83 )

File /cmmc/ptmp/pyironhb/mambaforge/envs/pyiron_mpie_cmti_2024-10-07/lib/python3.11/site-packages/pyiron_atomistics/project.py:333, in Project.load_from_jobpath(self, job_id, db_entry, convert_to_object)
    319 def load_from_jobpath(self, job_id=None, db_entry=None, convert_to_object=True):
    320     """
    321     Internal function to load an existing job either based on the job ID or based on the database entry dictionary.
    322 
   (...)
    331         GenericJob, JobCore: Either the full GenericJob object or just a reduced JobCore object
    332     """
--> 333     job = super(Project, self).load_from_jobpath(
    334         job_id=job_id, db_entry=db_entry, convert_to_object=convert_to_object
    335     )
    336     job.project_hdf5._project = self.__class__(path=job.project_hdf5.file_path)
    337     return job

File /cmmc/ptmp/pyironhb/mambaforge/envs/pyiron_mpie_cmti_2024-10-07/lib/python3.11/site-packages/pyiron_base/project/generic.py:1189, in Project.load_from_jobpath(self, job_id, db_entry, convert_to_object)
   1187 job = JobPath.from_job_id(db=self.db, job_id=job_id)
   1188 if convert_to_object:
-> 1189     job = job.to_object()
   1190     job.reset_job_id(job_id=job_id)
   1191     job.set_input_to_read_only()

File /cmmc/ptmp/pyironhb/mambaforge/envs/pyiron_mpie_cmti_2024-10-07/lib/python3.11/site-packages/pyiron_base/jobs/job/core.py:661, in JobCore.to_object(self, object_type, **qwargs)
    655 if self.project_hdf5.is_empty:
    656     raise ValueError(
    657         'The HDF5 file of this job with the job_name: "'
    658         + self.job_name
    659         + '" is empty, so it can not be loaded.'
    660     )
--> 661 return self.project_hdf5.to_object(object_type, **qwargs)

File /cmmc/ptmp/pyironhb/mambaforge/envs/pyiron_mpie_cmti_2024-10-07/lib/python3.11/site-packages/pyiron_base/storage/hdfio.py:1282, in ProjectHDFio.to_object(self, class_name, **kwargs)
   1268 def to_object(self, class_name: Optional[str] = None, **kwargs) -> object:
   1269     """
   1270     Load the full pyiron object from an HDF5 file
   1271 
   (...)
   1280         pyiron object of the given class_name
   1281     """
-> 1282     return _to_object(self, class_name, **kwargs)

File /cmmc/ptmp/pyironhb/mambaforge/envs/pyiron_mpie_cmti_2024-10-07/lib/python3.11/site-packages/pyiron_base/storage/hdfio.py:179, in _to_object(hdf, class_name, **kwargs)
    176 init_args.update(kwargs)
    178 obj = class_object(**init_args)
--> 179 obj.from_hdf(hdf=hdf.open(".."), group_name=hdf.h5_path.split("/")[-1])
    180 if static_isinstance(obj=obj, obj_type="pyiron_base.jobs.job.generic.GenericJob"):
    181     module_name = module_path.split(".")[0]

File /cmmc/ptmp/pyironhb/mambaforge/envs/pyiron_mpie_cmti_2024-10-07/lib/python3.11/site-packages/pyiron_atomistics/lammps/interactive.py:499, in LammpsInteractive.from_hdf(self, hdf, group_name)
    491 def from_hdf(self, hdf=None, group_name=None):
    492     """
    493     Recreates instance from the hdf5 file
    494 
   (...)
    497         group_name (str): Name of the group which contains the object
    498     """
--> 499     super(LammpsInteractive, self).from_hdf(hdf=hdf, group_name=group_name)
    500     self.species_from_hdf()

File /cmmc/ptmp/pyironhb/mambaforge/envs/pyiron_mpie_cmti_2024-10-07/lib/python3.11/site-packages/pyiron_base/jobs/job/interactive.py:394, in InteractiveBase.from_hdf(self, hdf, group_name)
    382 def from_hdf(
    383     self,
    384     hdf: Optional["pyiron_base.storage.hdfio.ProjectHDFio"] = None,
    385     group_name: Optional[str] = None,
    386 ):
    387     """
    388     Restore the InteractiveBase object in the HDF5 File
    389 
   (...)
    392         group_name (str): HDF5 subgroup name - optional
    393     """
--> 394     super(InteractiveBase, self).from_hdf(hdf=hdf, group_name=group_name)
    395     with self.project_hdf5.open("input") as hdf5_input:
    396         if "interactive" in hdf5_input.list_nodes():

File /cmmc/ptmp/pyironhb/mambaforge/envs/pyiron_mpie_cmti_2024-10-07/lib/python3.11/site-packages/pyiron_base/jobs/job/generic.py:1308, in GenericJob.from_hdf(self, hdf, group_name)
   1306     exe_dict["READ_ONLY"] = self._hdf5["executable/executable/READ_ONLY"]
   1307     job_dict["executable"] = {"executable": exe_dict}
-> 1308 self.from_dict(obj_dict=job_dict)

File /cmmc/ptmp/pyironhb/mambaforge/envs/pyiron_mpie_cmti_2024-10-07/lib/python3.11/site-packages/pyiron_atomistics/lammps/base.py:702, in LammpsBase.from_dict(self, obj_dict)
    700 super().from_dict(obj_dict=obj_dict)
    701 self._structure_from_dict(obj_dict=obj_dict)
--> 702 self.input.from_dict(obj_dict=obj_dict["input"])

TypeError: MlipInput.from_dict() got an unexpected keyword argument 'obj_dict'

Edit: I (Niklas) took the liberty to wrap the error into a traceback block

@niklassiemer (Member)

Do I see it correctly that you refer to the kernel pyiron/old_latest with the versions from the following snippet

import sys
import pyiron_atomistics
import pyiron_base

print(sys.version)
print(pyiron_atomistics.__version__)
print(pyiron_base.__version__)

being

3.10.14 | packaged by conda-forge | (main, Mar 20 2024, 12:45:18) [GCC 12.3.0]
0.5.4
0.8.2

and the pyiron/latest kernel with the versions

3.11.6 | packaged by conda-forge | (main, Oct  3 2023, 10:40:35) [GCC 12.3.0]
0.6.13
0.10.2

What kind of job fails to load? It seems to be a LAMMPS calculation with an MLIP potential? I would assume the same error occurs for any such job that was computed with the old versions and loaded with the new versions?

In that case we have a backwards-compatibility problem that needs a fix.
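
For illustration, here is a minimal, self-contained sketch of the kind of keyword mismatch that would produce exactly this TypeError. The class and argument names below are stand-ins invented for the example, not the actual pyiron/pyiron_contrib code: the newer pyiron_base passes the stored data as obj_dict (see the LammpsBase.from_dict frame in the traceback), while an input class written against an older API would still expect a differently named argument.

# Illustrative stand-ins only; OldStyleInput and NewStyleJob are not real pyiron classes.
class OldStyleInput:
    """Plays the role of MlipInput written against an older pyiron_base API."""

    def from_dict(self, input_dict):  # assumed old argument name
        self._data = input_dict


class NewStyleJob:
    """Plays the role of LammpsBase in the newer pyiron_base/pyiron_atomistics."""

    def __init__(self):
        self.input = OldStyleInput()

    def from_dict(self, obj_dict):
        # The newer framework passes the data with the keyword obj_dict,
        # which the old-style input class does not accept.
        self.input.from_dict(obj_dict=obj_dict["input"])


job = NewStyleJob()
try:
    job.from_dict(obj_dict={"input": {}})
except TypeError as err:
    print(err)  # ...from_dict() got an unexpected keyword argument 'obj_dict'

If that is indeed the cause, either updating the MLIP input class to the new from_dict signature or adding a compatibility shim in pyiron_base should restore loading of the old jobs.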

@niklassiemer added the bug label on Oct 15, 2024
@lfzhu-phys (Author)

@niklassiemer Thanks for looking into it. One set of the data, calculated with the Python 3.10 kernel, consists of LAMMPS calculations using an MLIP potential; the other set consists of static VASP calculations run with the Python 3.11 kernel. Neither kernel allows me to load the data produced with the other one :)
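
As a possible interim workaround (untested here, and only if the analysis needs the stored output rather than fully reconstructed job objects), the old jobs could be inspected instead of loaded: Project.inspect() returns a lightweight view of the job without calling from_hdf()/from_dict(), so reading arrays straight from the HDF5 file may sidestep the signature mismatch. The project path and job name below are placeholders.

from pyiron_atomistics import Project

# Placeholder project path and job name, only to illustrate the pattern.
pr = Project("old_lammps_project")
job = pr.inspect("lammps_mlip_job")  # inspect: no full object reconstruction

# Output can be read directly from the job's HDF5 file via its HDF paths:
energies = job["output/generic/energy_tot"]
positions = job["output/generic/positions"]
print(energies)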
