
Conversation

@krokicki
Member

This PR adds some support for Cellmap-style N5 data sets. In particular, we assume something is a compatible data set if it has an attributes.json and there is a s0 child directory. We then read the attributes.json and s0/attributes.json and display relevant information in the metadata tables similar to the Zarr implementation. Links to Neuroglancer and "Copy data URL" are provided for now.
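The detection heuristic described above can be sketched as follows. This is a minimal illustration, not the actual Fileglancer implementation; the function names `is_cellmap_n5` and `read_metadata` are hypothetical:

```python
import json
from pathlib import Path

def is_cellmap_n5(path: str) -> bool:
    """Heuristic: treat a directory as a CellMap-style N5 data set
    if it contains an attributes.json and an s0 child directory."""
    root = Path(path)
    return (root / "attributes.json").is_file() and (root / "s0").is_dir()

def read_metadata(path: str) -> dict:
    """Read the root attributes.json and s0/attributes.json, mirroring
    what the metadata tables would display."""
    root = Path(path)
    meta = {"root": json.loads((root / "attributes.json").read_text())}
    s0_attrs = root / "s0" / "attributes.json"
    if s0_attrs.is_file():
        meta["s0"] = json.loads(s0_attrs.read_text())
    return meta
```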

Feedback is welcome from anyone with some data sets to try. I'll ping everyone again once we have this up on the dev server, if you don't want to bother with building it locally.

@mkitti @JaneliaSciComp/fileglancer @StephanPreibisch @yuriyzubov

@mkitti
Contributor

mkitti commented Dec 18, 2025

I'm just going to link a bunch of cross references here:

pixelResolution key, alternative to resolution

One common variant I notice is a pixelResolution key:

Example from COSEM documentation:

"pixelResolution": {
    [ <x-resolution>, <y-resolution>, <z-resolution> ],
    "unit": <spatial-units>
}

I think there should be a dimensions key there.

Example from the Neuroglancer documentation:

  "pixelResolution": {
    "unit": "nm",
    "dimensions": [4, 4, 30]
  }

There's an example from n5-viewer that shows pixelResolution can have an array as a direct value:

{"pixelResolution": [0.097, 0.097, 0.18]}
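The dict and bare-array variants above could be normalized with a small helper. This is a sketch only; `parse_pixel_resolution` is a hypothetical name, not an API from any of the projects discussed here:

```python
def parse_pixel_resolution(attrs: dict):
    """Return (dimensions, unit) from either pixelResolution variant,
    or None if the key is absent. The unit is None when unspecified."""
    pr = attrs.get("pixelResolution")
    if pr is None:
        return None
    if isinstance(pr, dict):
        # Neuroglancer-style object: {"unit": "nm", "dimensions": [4, 4, 30]}
        return list(pr.get("dimensions", [])), pr.get("unit")
    # n5-viewer-style bare array: [0.097, 0.097, 0.18]; no unit given
    return list(pr), None
```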

@mkitti
Contributor

mkitti commented Dec 18, 2025

COSEM metadata refers to a transform key:

"transform": {
    "axes": [ "x", "y", "z" ], 
    "scale": [ <x-resolution>, <y-resolution>, <z-resolution> ],
    "translate": [ <x-offset>, <y-offset>, <z-offset> ],
    "units": [ <x-units>, <y-units>, <z-units> ]
}
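For comparison, the transform key groups everything per axis, which could be unpacked like this. Again a sketch; `parse_cosem_transform` is a hypothetical helper:

```python
def parse_cosem_transform(attrs: dict):
    """Unpack a COSEM-style 'transform' into a per-axis mapping of
    scale, unit, and translate, or return None if absent."""
    t = attrs.get("transform")
    if t is None:
        return None
    return {axis: {"scale": s, "unit": u, "translate": tr}
            for axis, s, u, tr in zip(t["axes"], t["scale"], t["units"], t["translate"])}
```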

@mkitti
Contributor

mkitti commented Dec 18, 2025

List of CellMap N5 datasets:

  • /nrs/cellmap/data_external/jrc_atla24-b9-1/jrc_atla24-b9-1.n5/em/fibsem-uint8
  • /nrs/cellmap/data_external/jrc_atla24-b9-2/jrc_atla24-b9-2.n5
  • /nrs/cellmap/data_external/jrc_atla52-b10/jrc_atla52-b10.n5
  • /nrs/cellmap/data_external/jrc_atla52-b10-2/jrc_atla52-b10-2.n5

Ran in /nrs/cellmap:

$ find "$(pwd)" -type d -name "*.n5" -maxdepth 3
/nrs/cellmap/data/hemi-brain_8x8x8nm/tmphemibrain.n5
/nrs/cellmap/data/jrc_amphiuma-means-heart-1/jrc_amphiuma-means-heart-1.n5
/nrs/cellmap/data/jrc_amphiuma-means-liver-1/jrc_amphiuma-means-liver-1.n5
/nrs/cellmap/data/jrc_aphid-salivary-1/jrc_aphid-salivary-1.n5
/nrs/cellmap/data/jrc_c-elegans-comma-1-two-channel/jrc_c-elegans-comma-1-two-channel.n5
/nrs/cellmap/data/jrc_hela-1/jrc_hela-1.n5
/nrs/cellmap/data/jrc_hela-1/jrc_hela-1_labels.n5
/nrs/cellmap/data/jrc_hum-airway-15056-2/jrc_hum-airway-15056-2.n5
/nrs/cellmap/data/jrc_hum-glioblastoma-1/jrc_hum-glioblastoma-1.n5
/nrs/cellmap/data/jrc_mosquito-stylet-5/jrc_mosquito-stylet-5.n5
/nrs/cellmap/data/jrc_mosquito-stylet-6/jrc_mosquito-stylet-6.n5
/nrs/cellmap/data/jrc_mus-cerebellum-1/jrc_mus-cerebellum-1.n5
/nrs/cellmap/data/jrc_mus-cortex-2/jrc_mus-cortex-2.n5
/nrs/cellmap/data/jrc_mus-cortex-4/jrc_mus-cortex-4.n5
/nrs/cellmap/data/jrc_mus-cortex-5/jrc_mus-cortex-5.n5
/nrs/cellmap/data/jrc_mus-duodenum-1/jrc_mus-duodenum-1.n5
/nrs/cellmap/data/jrc_mus-duodenum-1a/jrc_mus-duodenum-1a.n5
/nrs/cellmap/data/jrc_mus-heart-2/jrc_mus-heart-2.n5
/nrs/cellmap/data/jrc_mus-heart-2_slow/jrc_mus-heart-2_slow.n5
/nrs/cellmap/data/jrc_mus-heart-3/jrc_mus-heart-3.n5
/nrs/cellmap/data/jrc_mus-heart-4/jrc_mus-heart-4.n5
/nrs/cellmap/data/jrc_mus-heart-5/jrc_mus-heart-5.n5
/nrs/cellmap/data/jrc_mus-heart-6/jrc_mus-heart-6.n5
/nrs/cellmap/data/jrc_mus-kidney-4/jrc_mus-kidney-4.n5
/nrs/cellmap/data/jrc_mus-liver-zon-1/jrc_mus-liver-zon-1.n5
/nrs/cellmap/data/jrc_mus-liver-zon-3/jrc_mus-liver-zon-3.n5
/nrs/cellmap/data/jrc_mus-lung-2a/jrc_mus-lung-2a.n5
/nrs/cellmap/data/jrc_mus-pancreas-7/jrc_mus-pancreas-7.n5
/nrs/cellmap/data/jrc_mus-skel-muscle-1/jrc_mus-skel-muscle-1.n5
/nrs/cellmap/data/jrc_mus-skin-2/jrc_mus-skin-2.n5
/nrs/cellmap/data/jrc_mus-skin-2a/jrc_mus-skin-2a.n5
/nrs/cellmap/data/jrc_mus-thymus-2/jrc_mus-thymus-2.n5
/nrs/cellmap/data/jrc_mus-thymus-2a/jrc_mus-thymus-2a.n5
/nrs/cellmap/data/jrc_mus_lung-2/jrc_mus_lung-2.n5
/nrs/cellmap/data/jrc_velella-b8-1/jrc_velella-b8-1.n5
/nrs/cellmap/data/jrc_zf-pancreas-1/jrc_zf-pancreas-1.n5
/nrs/cellmap/data/test_albert/jrc_mus_cortex_1.n5
/nrs/cellmap/data_external/jrc_atla24-b9-1/jrc_atla24-b9-1.n5
/nrs/cellmap/data_external/jrc_atla24-b9-2/jrc_atla24-b9-2.n5
/nrs/cellmap/data_external/jrc_atla52-b10/jrc_atla52-b10.n5
/nrs/cellmap/data_external/jrc_atla52-b10-2/jrc_atla52-b10-2.n5

@krokicki
Member Author

Thanks, @mkitti !

I tried the 4 examples you listed and they all seem to work.

The Neuroglancer docs you linked say that "The pixelResolution attribute is not recommended". As we discussed, I'm not sure it's a good idea for Fileglancer to attempt to support all possible metadata schemas for N5. There is an easy fallback, which is just viewing the metadata directly (we'll add JSON formatting to make that easier).

Please let me know if there are other remaining "must have" requirements for N5 for this feature to be useful.

@mkitti
Contributor

mkitti commented Dec 19, 2025

I do agree we should not support everything, but I am looking at datasets with modified timestamps from a week ago that use pixelResolution:
/nrs/cellmap/data/jrc_amphiuma-means-heart-1/jrc_amphiuma-means-heart-1.n5/render/jrc_amphiuma_means_heart_1/v1_acquire_align___20251212_193816/attributes.json

I just spoke with @trautmane, who confirmed for me that the active render pipeline writes pixelResolution, units, scales, and translate:
https://github.com/saalfeldlab/render/blob/de0b314722af15c987f930c9398e086d08dbfe75/render-app/src/main/java/org/janelia/alignment/util/NeuroglancerAttributes.java#L175-L178

        attributes.put("units", units);
        attributes.put("scales", scales);
        attributes.put("pixelResolution", pixelResolution);
        attributes.put("translate", translate);

There also seems to be Markdown documentation, which is distinct from the reST documentation.
https://github.com/google/neuroglancer/blob/master/src/datasource/n5/README.md
https://github.com/google/neuroglancer/blob/master/src/datasource/n5/index.rst

Both the render pipeline and the Neuroglancer documentation have the following keys in common:

  • units
  • scales
  • pixelResolution

There seems to be redundancy between pixelResolution.unit and units.

I think the priority is then as follows:

  • units takes precedence over pixelResolution.unit since units allows each axis to have a distinct unit.
    • If neither units nor pixelResolution.unit is present, then we assume the numbers are in micrometers.
  • downsamplingFactors takes precedence over scales

@yuriyzubov

@mkitti @krokicki thanks!

I think it would be great if we added validation for the metadata required by Neuroglancer and n5-viewer. I also agree with @mkitti's comment about pixel resolution.

I am not sure about COSEM metadata, though. In addition to the COSEM metadata, all COSEM/CellMap N5 datasets have all the attributes (pixelResolution, scales, and units) needed to be compatible with Neuroglancer. Perhaps we should not include the COSEM metadata schema in the list of valid schemas, since it is not being actively used or supported. If we do want to display COSEM metadata, units, scales, and pixelResolution should take precedence over transform.

@bogovicj

I described a couple of "dialects" that have been used in the building a while ago:
https://github.com/saalfeldlab/n5-ij/wiki/N5-Metadata-Dialects
and there are even more than these.

Agreed, it's not worth supporting them all.

@mkitti
Contributor

mkitti commented Dec 19, 2025

My proposal is basically that we should support (set of keys supported by Neuroglancer) ∩ (set of keys actively used at Janelia).

At the moment, that intersection seems to be equivalent to (set of keys supported by Neuroglancer). The set of keys supported by Neuroglancer is currently a strict subset of the keys used at Janelia. This is in part because we have made changes so that we export keys supported by Neuroglancer.
