Given JSON data and a query, there is no option in jq that, instead of printing the value itself, prints the location of possible matches.

This is because JSON parsers providing an interface to developers usually focus on processing the logical structure of a JSON input, not the textual stream conveying it. You would have to instruct jq to explicitly treat its input as raw text while, at the same time, properly parsing it in order to extract the queried value. In the case of jq, the former can be achieved using the --raw-input (or -R) option, the latter by parsing the read-in JSON-encoded string using fromjson.
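To illustrate the combination, a minimal sketch could look like this (the file name sample.json is a placeholder, and the top level of the document is assumed to be an object); it accesses one and the same input both textually and structurally:

jq -Rs '{
  raw_chars: length,                  # length of the raw input text (available thanks to -R and -s)
  top_level_keys: (fromjson | keys)   # keys of the parsed document (available thanks to fromjson)
}' sample.json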
The -R option alone would read the input linewise, passing each line as a separate string, so the lines would first have to be collected and concatenated (e.g. using -n '[inputs] | add') in order to provide the whole input at once to fromjson. The other way round, you could also provide the --slurp (or -s) option, which (in combination with -R) already concatenates the input into a single string, which then, after having been parsed with fromjson, would have to be split again into lines (e.g. using ./"\n") in order to provide row numbers. I found the latter to be more convenient; both routes are sketched below.
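As plain sketches (again with sample.json as a placeholder), the two routes could look like this:

# -R alone: a stream of line strings, collected with [inputs] and concatenated with add
# (the stripped newlines are only insignificant whitespace in valid JSON, as literal
# newlines cannot occur inside JSON string literals)
jq -Rn '[inputs] | add | fromjson' sample.json

# -R combined with -s: the whole input as one string, parsed directly
jq -Rs 'fromjson' sample.json

Both commands simply re-emit the parsed document, confirming that the textual input has been reassembled and parsed correctly.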
That said, this could give you a starting point (the --raw-output (or -r) option outputs raw text instead of JSON):
jq -Rrs '
"\(fromjson.key2.key2_2.key2_2_1)" as $query # save the query value as string
| ($query | length) as $length # save its length by counting its characters
| ./"\n" | to_entries[] # split into lines and provide 0-based line numbers
| {row: .key, col: .value | indices($query)[]} # find occurrences of the query
| "(\(.row),\(.col)) (\(.row),\(.col + $length))" # format the output
'
(5,24) (5,34)
Now, this works for the sample query, but how about the general case? Your example queried a number (1.43123123), which is an easy target, as it has the same textual representation when encoded as JSON. Therefore, a simple string search and length count did a fairly good job (not a perfect one, because it would still find any occurrence of that character stream, not just “values”). Thus, for more precision, but especially with more complex JSON datatypes being queried, you would need to develop a more sophisticated searching approach, probably involving more JSON conversions, whitespace stripping and other normalizing shenanigans. So, unless your goal is to rebuild a full JSON parser within another one, you should narrow it down to the kind of queries you expect, and compose an appropriately tailored searching approach. This solution provides you with the concepts to simultaneously process the input textually and structurally, along with a simple search and output integration.
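As one example of such tailoring (the key name key1 is hypothetical, and the document is assumed to encode the value exactly as jq's tojson does, i.e. with the same escaping and no extra whitespace inside the match), a query for a string value could search for its re-encoded form, so that the surrounding quotes become part of the match:

jq -Rrs '
  (fromjson.key1 | tojson) as $query              # re-encode the queried value; includes the quotes
  | ($query | length) as $length                  # length of the encoded form
  | ./"\n" | to_entries[]                         # split into lines with 0-based line numbers
  | {row: .key, col: .value | indices($query)[]}  # find occurrences of the encoded value
  | "(\(.row),\(.col)) (\(.row),\(.col + $length))"
'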