cli v2.6.0: Editing items with `op item get | jq | op item edit` does not create new fields

unrob
Community Member

I'm reading the output of op item get (v2.6.0) and adding a brand new field to an existing section using jq. It looks like this:

op item get "top-secret" |
  jq '.fields += [{
      id: "my-new-field",
      type: "STRING",
      purpose: "",
      label: "my-new-field",
      value: "very secret"
    }]' |
  op item edit "top-secret"

This runs fine and a new version is generated, but my-new-field is not present in the 1Password item. Changing values that are already present in the item works fine; only adding new ones silently fails. Does anyone know if this is the expected behavior?

I've verified the resulting JSON by using it to create a new item, but I'd like to avoid deleting the existing one so its versions stay linked. I've also made a gnarly jq filter to turn JSON into shell arguments for op item edit, but it honestly feels like a horrible hack I'd rather not maintain.
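
For reference, the verification step looked roughly like this (op item create is happy to take the piped JSON and make a new item out of it):

# pipe the edited item JSON into `op item create`; "top-secret-copy" is just a throwaway title
op item get "top-secret" |
  jq '.fields += [{id: "my-new-field", type: "STRING", label: "my-new-field", value: "very secret"}]' |
  op item create --title "top-secret-copy"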

Would appreciate any guidance!


1Password Version: 8.8.0
Extension Version: Not Provided
OS Version: macOS 12.4
Browser: Not Provided

Comments

  • Hi @unrob! The main impediment here is that, while we support piped input for creating an item, we do not currently support piped input for editing one. This means that the above op item edit "top-secret" will not see the provided JSON at all; when it executes, it will simply edit the item "in place", replacing it with itself and only incrementing the version. May I suggest that you use field assignments (shell arguments) for this task:

    Add a new custom field to an item's section:

    `op item edit 'My Example Item' 'section2.field5[phone]=1-234-567-8910'`
    

    (extracted from the docs: op item edit --help)
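
    For your "top-secret" item, that would look something along these lines:

    # "my-section" is a placeholder for the id or label of the section the field should live in
    op item edit 'top-secret' 'my-section.my-new-field[text]=very secret'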

    Is there any reason why the above solution is not desirable in your case?

    Best,
    Andi

  • unrob
    Community Member

    Thanks for the suggestion @andi.t_1P. While that does work, and I can turn a JSON document into that format, it's less than desirable to use yet another format for structured data when JSON already works just fine. Working with more than one field through scripting is very cumbersome: one needs not only to format the data as those url-encoded-ish key-value pairs, but also to find a way to properly read potentially multi-line values in order to feed them into op item edit programmatically.

    All doable, of course, but terribly annoying in my opinion; here's an example of how I went about it:

    # we start with a jq function that takes a list of fields (i.e. .fields)
    # and turns each one into an `op item edit` assignment argument
    def fields_to_cli($delete_field_names; $separator):
      # produces: section.field\.name[type|delete](=value)
      map(
        (.section.id // "") + (if .section then "." else "" end) + (.label | gsub("\\."; "\\.")) +
        "[" + (if .purpose == "PASSWORD" then "password" else "text" end) + "]=" + .value
      ) +
      ($delete_field_names | map((. | gsub("\\."; "\\.")) + "[delete]=")) |
      sort |
      join($separator);
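    # for example, a field in section "section2" labelled "field5" with value
    # "1-234-567-8910" comes out as: section2.field5[text]=1-234-567-8910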
    
    function deleted_fields_from() {
      # helper that returns a JSON list of ids of fields present in a
      # 1Password item but removed from the correspondingly named file
      # in the local filesystem
      jq -s \
        '((.[1].fields | map(.id)) - (.[0].fields | map(.id)))' \
        "$1" <(op item get "${1%.json}")
    }
    
    # first, get the item
    op item get "top-secret" > top-secret.json
    
    # then edit the downloaded file
    jq ...
    
    # then figure out which fields were deleted
    deleted="$(deleted_fields_from "top-secret.json")" 
    
    # then pick a boundary/separator character (¬)
    # and hope there's no values or keys that use it
    # every value, delimited by that separator will be read
    # as an item of the `args` array
    # note: the fields_to_cli def from above needs to be in scope for this jq program
    IFS='¬' read -ra args < <(
      jq -r \
        --arg separator '¬' \
        --argjson delete_field_names "$deleted" \
        '.fields | fields_to_cli($delete_field_names; $separator)' \
        "top-secret.json")
    
    # finally, expand that array, and hope the parser for `op item edit` doesn't change. 
    op item edit "top-secret" -- "${args[@]}"
    

    Compare that to:

    op item get "top-secret" | jq 'some filter' | op item edit "top-secret"
    

    The latter is more desirable in my opinion, and I would love to see straight-from-JSON updating in op at some point in the future. At the very least, I'd expect invoking op item edit without any key-value pairs to result in an error or warning, instead of it incrementing the version.

  • Thanks for your suggestion. We already have an issue tracking the feature of allowing piped input for editing an item.

    All the best,
    Andi

  • unrob
    Community Member

    Thanks for adding the feature request, Andi!


    I think my script above is broken with multi-line values. To account for that, I'll be joining fields with the null character (and hoping it's not part of the content itself), then using xargs to pass them as arguments to op item edit:

    function edit_item_args () {
      # reads the existing op item into $remote
      # then grabs all field names to delete  
      jq -j -r --exit-status \
        --argjson remote "$(op item get "some-item")" \
        'def fields_to_cli($delete_field_names):
          map(
            (.section.id // "") + (if .section then "." else "" end) + (.label | gsub("\\."; "\\."))+
            "["+(if .purpose == "PASSWORD" then "password" else "text" end)+"]="+.value
          ) +
          ($delete_field_names | map((.| gsub("\\."; "\\."))+"[delete]=")) |
          sort |
          join("\u0000");
    
        (($remote.fields | map(.id // .label)) - map(.id // .label)) as $to_delete |
        fields_to_cli($to_delete)' "$1"
    }
    
    # finally, call our function, pipe to xargs and hope for the best!
    edit_item_args <(op item get "some-item" | jq '.fields + [{
      id: "my-new-field",
      type: "STRING",
      purpose: "",
      label: "my-new-field",
      value: "very secret"
    }]') | xargs -0 -r op item edit "some-item" --
    

    Still not ideal, but it gets the job done :)
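
    To double-check that the new field actually landed, I read the item back and filter for it (jq exits non-zero when it's missing):

    # look for the freshly added field in the edited item
    op item get "some-item" | jq --exit-status '.fields[] | select(.label == "my-new-field")'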

  • Thanks for the feedback!

  • unrob
    Community Member

    Same idea, now in golang.

    I'm not sure I can get around this by using Connect, but I'm hoping to find out soon. Either way, I would really love to feed JSON into op item edit!

  • Hey @unrob:

    Thanks for the additional feedback!

    Jack

This discussion has been closed.