This repository was archived by the owner on Jan 5, 2024. It is now read-only.

key error when layers have different flattened coordinates #67

@jenellefeather

Description

I was running brainscore on some transformer models and ran into an issue where channel_x is not a key in a dictionary. The problem seems to be noted in the code:

# using these names/keys for all assemblies results in KeyError if the first layer contains flatten_coord_names

Log here for the failed model:
http://braintree.mit.edu:8080/job/run_benchmarks/3861/parsed_console/log.html
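
For context, here is a minimal, hypothetical sketch of the mismatch (the flatten_coords helper and the shapes are made up for illustration; this is not the model-tools packaging code): a conv layer's activations flatten to channel / channel_x / channel_y coordinates, while a transformer block or the final fc layer flattens to a single embedding-style coordinate, so reusing the first layer's coordinate names for every layer raises a KeyError.

```python
# Hypothetical, self-contained sketch -- not the model-tools packaging code.
import numpy as np

def flatten_coords(shape, names):
    """Enumerate flattened coordinate values for one layer's activation shape."""
    grids = np.meshgrid(*[np.arange(s) for s in shape], indexing="ij")
    return {name: grid.ravel() for name, grid in zip(names, grids)}

# A convolutional layer flattens to channel / channel_x / channel_y coordinates ...
conv_coords = flatten_coords((64, 7, 7), ["channel", "channel_x", "channel_y"])

# ... while a transformer block or the final fc (logits) layer flattens to a
# single embedding-style coordinate, so it has no channel_x (or channel_y) entry.
fc_coords = flatten_coords((1000,), ["embedding"])

# Reusing the first layer's coordinate names for every layer, as the quoted
# comment describes, fails as soon as a name is absent from a later layer:
first_layer_names = list(conv_coords)  # ['channel', 'channel_x', 'channel_y']
try:
    [fc_coords[name] for name in first_layer_names]
except KeyError as err:
    print("KeyError:", err)  # fails on the first missing name
```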

I also found that I could not include the final fc (logits) layer among the layers I was grabbing activations from, as this caused a KeyError ('embedding' was missing). I just removed these layers from scoring, but it seems like something others might run into.
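
Beyond dropping the offending layers, a more general workaround might be to group layers by the coordinate names their activations flatten to and package/score each group separately, so no assembly mixes incompatible coordinate sets. The sketch below is purely illustrative; the layer names, shapes, and the coord_names_for convention are assumptions, not brain-score API.

```python
# Hypothetical grouping workaround -- layer names, shapes, and the naming
# convention below are illustrative assumptions, not part of the brain-score API.
from collections import defaultdict

def coord_names_for(shape):
    """Coordinate names a layer's activations would flatten to (assumed convention)."""
    return ("channel", "channel_x", "channel_y") if len(shape) == 3 else ("embedding",)

layer_shapes = {
    "conv1": (64, 56, 56),
    "encoder.block3": (768,),
    "fc": (1000,),
}

groups = defaultdict(list)
for layer, shape in layer_shapes.items():
    groups[coord_names_for(shape)].append(layer)

for names, layers in groups.items():
    print(names, "->", layers)
# ('channel', 'channel_x', 'channel_y') -> ['conv1']
# ('embedding',) -> ['encoder.block3', 'fc']
# Each group could then be extracted and scored on its own, avoiding the KeyError.
```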
