Track artifacts, in-memory objects & folders [S3 storage]¶
import lamindb as ln
import pytest
ln.setup.login("testuser1")
ln.setup.init(storage="s3://lamindb-ci/test-upload")
✅ logged in with email testuser1@lamin.ai (uid: DzTjkKse)
💡 go to: https://lamin.ai/testuser1/test-upload
❗ updating & unlocking cloud SQLite 's3://lamindb-ci/test-upload/9f1dbf16f2b45bee83ebc4ce719ac4e4.lndb' of instance 'testuser1/test-upload'
❗ locked instance (to unlock and push changes to the cloud SQLite file, call: lamin close)
Local artifacts¶
Load some test data:
pbmc68k = ln.core.datasets.anndata_pbmc68k_reduced()
Subset to a mini artifact to speed up the run time of this notebook:
pbmc68k = pbmc68k[:5, :5].copy()
pbmc68k
AnnData object with n_obs × n_vars = 5 × 5
obs: 'cell_type', 'n_genes', 'percent_mito', 'louvain'
var: 'n_counts', 'highly_variable'
uns: 'louvain', 'louvain_colors', 'neighbors', 'pca'
obsm: 'X_pca', 'X_umap'
varm: 'PCs'
obsp: 'connectivities', 'distances'
Upload from memory using explicit semantic key¶
Upload h5ad¶
pbmc68k_h5ad = ln.Artifact.from_anndata(pbmc68k, key="test-upload/pbmc68k.h5ad")
❗ no run & transform get linked, consider calling ln.track()
pbmc68k_h5ad.save()
Artifact(uid='7V62SFbORoSU6HUHnvY1', key='test-upload/pbmc68k.h5ad', suffix='.h5ad', accessor='AnnData', size=100800, hash='bobVigHGtkrAHgPJojvpLA', hash_type='md5', visibility=1, key_is_virtual=True, created_by_id=1, storage_id=1, updated_at='2024-05-25 15:35:15 UTC')
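The `hash` field above has the shape of an MD5 digest encoded as URL-safe base64 with padding stripped (22 characters). Purely as an illustration of that encoding, not of lamindb's actual hashing internals, a minimal sketch:

```python
import base64
import hashlib

def b64_md5(data: bytes) -> str:
    """Encode an MD5 digest as URL-safe base64 with '=' padding stripped."""
    digest = hashlib.md5(data).digest()  # 16 raw bytes
    return base64.urlsafe_b64encode(digest).decode().rstrip("=")

h = b64_md5(b"example artifact content")
print(len(h))  # 22 -- the same length as the hash in the repr above
```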
pbmc68k_h5ad.delete(permanent=True)
Upload zarr¶
# Runs too long, should be tested elsewhere
# pbmc68k_zarr = ln.Artifact(pbmc68k, key="test-upload/pbmc68k.zarr", format="zarr")
# ln.save(pbmc68k_zarr)
# pbmc68k_zarr.delete(permanent=True, storage=True)
Upload using id with implicit key¶
Upload h5ad¶
pbmc68k_h5ad = ln.Artifact.from_anndata(pbmc68k, description="pbmc68k.h5ad")
❗ no run & transform get linked, consider calling ln.track()
pbmc68k_h5ad.save()
Artifact(uid='YnvKJOBPI04YK9JTqGi1', description='pbmc68k.h5ad', suffix='.h5ad', accessor='AnnData', size=100800, hash='bobVigHGtkrAHgPJojvpLA', hash_type='md5', visibility=1, key_is_virtual=True, created_by_id=1, storage_id=1, updated_at='2024-05-25 15:35:16 UTC')
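Note that this artifact carries the same `hash` as the key-based upload above, because the content is identical. Content hashes like this make duplicates detectable before re-uploading. A toy sketch of that idea (hypothetical names, not lamindb internals):

```python
import base64
import hashlib

def content_hash(data: bytes) -> str:
    # URL-safe base64 MD5, padding stripped -- mirrors the 22-char hashes above
    return base64.urlsafe_b64encode(hashlib.md5(data).digest()).decode().rstrip("=")

class ToyRegistry:
    """Hypothetical content-addressed registry: same bytes, same record."""

    def __init__(self):
        self._by_hash = {}

    def save(self, data: bytes, key: str) -> tuple[str, bool]:
        h = content_hash(data)
        is_new = h not in self._by_hash
        if is_new:
            self._by_hash[h] = key  # an actual upload would happen here
        return h, is_new

registry = ToyRegistry()
blob = b"pbmc68k-like payload"
_, first = registry.save(blob, "test-upload/pbmc68k.h5ad")
_, second = registry.save(blob, "pbmc68k.h5ad")  # same content, different key
print(first, second)  # True False: the second save is flagged as a duplicate
```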
pbmc68k_h5ad.delete(permanent=True, storage=True)
Upload zarr¶
# Runs too long, should be tested elsewhere
# pbmc68k_zarr = ln.Artifact(pbmc68k, key="test-upload/pbmc68k.zarr", format="zarr")
# ln.save(pbmc68k_zarr)
# pbmc68k_zarr.delete(permanent=True, storage=True)
Error behaviors¶
The specified bucket does not exist:
with pytest.raises(FileNotFoundError):
pbmc68k_h5ad = ln.Artifact("s3://inexistent-bucket-239809834/pbmc68k.h5ad")
❗ no run & transform get linked, consider calling ln.track()
The specified path does not exist:
with pytest.raises(FileNotFoundError):
pbmc68k_h5ad = ln.Artifact("s3://lndb-setup-ci/pbmc68k.h5ad")
❗ no run & transform get linked, consider calling ln.track()
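The pattern above can be exercised without touching S3: `pytest.raises` succeeds only when the expected exception is actually raised inside the `with` block. A self-contained sketch with a hypothetical loader standing in for `ln.Artifact`:

```python
import pytest

def load_artifact(path: str) -> bytes:
    """Hypothetical loader: raises FileNotFoundError for unknown paths."""
    fake_store = {"s3://demo-bucket/ok.h5ad": b"contents"}
    if path not in fake_store:
        raise FileNotFoundError(path)
    return fake_store[path]

# A missing path: the context manager swallows the expected exception.
with pytest.raises(FileNotFoundError):
    load_artifact("s3://demo-bucket/missing.h5ad")

# An existing path loads normally.
assert load_artifact("s3://demo-bucket/ok.h5ad") == b"contents"
```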
Test existing zarr¶
See test_artifact.py for other artifact types. This should probably go elsewhere:
# temporarily commented out because of a head-bucket permission error when
# attempting to get the region
# artifact = ln.Artifact("s3://lamindb-ci/lndb-storage/pbmc68k.zarr")
# artifact.save()
# artifact.backed()
ln.setup.delete("test-upload", force=True)
💡 deleted storage record on hub f6bca7acd1615eeaafc3aafe3b8e85fb
💡 deleted instance record on hub 9f1dbf16f2b45bee83ebc4ce719ac4e4