Rijnard van Tonder rvantonder @sourcegraph Phoenix, AZ rijnard.com

ddcc/15745-s15-project 2

15-745 Optimizing Compilers S15 Project

agroce/mutantsnsfsmall 1

NSF proposal on mutation testing

comby-tools/sample-catalog 1

A sample catalog with rewrite templates. Feel free to fork and modify with your own!

rvantonder/archat 1

server client chat

rvantonder/bandwidth-tester 1

test tcp/udp bandwidth on point to point connection

rvantonder/2048 0

2048 Game with Kivy

rvantonder/afl 0

american fuzzy lop (copy of the source code for easy access)

rvantonder/afl-compiler-fuzzer 0

Variation of american fuzzy lop for testing compilers for C-like languages

push event sourcegraph/sourcegraph

Rijnard van Tonder

commit sha 92c61ca6bc7fb8af5f26505406477f1d5be918d7

search: update integration tests for parser migration (#12820)


push time in 26 minutes

delete branch sourcegraph/sourcegraph

delete branch : rvt/update-search-integration-tests-for-parser-migration

delete time in 26 minutes

PR merged sourcegraph/sourcegraph

search: update integration tests for parser migration

Happy to say that the parser migration passes all of the existing search integration tests. The new parser is stricter about enforcing that type:file is specified with stable:yes (addresses https://github.com/sourcegraph/sourcegraph/issues/9715), so the relevant tests are updated here. There are also some queries using stable:yes that we don't technically support yet, so I have updated those tests as well.

The assumption from here on out is that these tests work consistently with the new parser, until I/we put @unknwon's testing work into CI.

There is one discrepancy for a test that returns a slightly different suggested query for a particular alert (ordering is different). This is too low value to try and make consistent, but just noting it.

Old:

Screen Shot 2020-08-06 at 8 12 06 PM

New:

Screen Shot 2020-08-06 at 8 12 46 PM

+11 -11

0 comment

1 changed file

rvantonder

pr closed time in 26 minutes

push event sourcegraph/sourcegraph

Rijnard van Tonder

commit sha dd1d5dd5f33027f1b102c7641c509116bb2f0a25

remove replacer service (#12812)


push time in 2 hours

delete branch sourcegraph/sourcegraph

delete branch : rvt/replacer-agane

delete time in 2 hours

PR merged sourcegraph/sourcegraph

remove replacer service (take 2)

I will merge this again 10:00 AM PT Aug 7. No need to approve again.

+8 -1523

1 comment

45 changed files

rvantonder

pr closed time in 2 hours

issue opened aws/amazon-freertos

Experimenting with CBMC with and without patches

Hi 👋 . I chatted a bit with @danielsn and I'm interested in experimenting with some of the patch preprocessing this project uses with cbmc, along the lines of the existing patches. I was able to get up and running with the ctest command in the README.

The tests seem flaky (I'm on Mac OS X): sometimes two and sometimes four tests fail, but I'll just ignore this unless it's a real problem. For reference:

The following tests FAILED:
         12 - IotHttpsClient_Disconnect-report (Failed)
         28 - IotHttpsClient_SendSync-report (Failed)
The following tests FAILED:
          8 - IotHttpsClient_Connect-report (Failed)
         12 - IotHttpsClient_Disconnect-report (Failed)
         20 - IotHttpsClient_ReadHeader-report (Failed)
         28 - IotHttpsClient_SendSync-report (Failed)

Couple of questions:

I'd basically like to, for example, add my own patch and see whether that changes the behavior of cbmc. To do that, it'd be helpful to understand how the current patches are applied in the analysis pipeline. I looked at some of the scripts, makefiles, and patches, but it's not clear to me how they fit together. It'd be great to have a simple rundown of:

  • What command to enter that uses the patches for some analysis (does ctest do this, or is there another command?)

  • The scripts and steps that the above command invokes (so that I can understand how they fit together and modify parts of it). I'm interested in adding or removing patches, understanding how they affect the current analysis, and what the analysis target is (the set of programs or subparts that are analyzed).

Thanks!

created time in 11 hours

issue comment sourcegraph/sourcegraph

syntect-server: frequent worker deaths, highlighting timeouts, highlighting errors

worker deaths are a different topic - that means the server got entirely stuck and it had to kill one of its 8 subprocesses, leading to 1/8th of the pending requests failing. Those should generally never happen.

@slimsag any chance you saw my comment in https://github.com/sourcegraph/sourcegraph/issues/12347#issuecomment-669711123?

bobheadxi

comment created time in 17 hours

Pull request review comment sourcegraph/sourcegraph

search: document -content and NOT

 Click the <img src=../img/brackets.png> toggle to activate structural search. St
 Note: It is not possible to perform case-insensitive matching with structural search.
-## Keywords (all searches)
+## Keywords
 
-The following keywords can be used on all searches (using [RE2 syntax](https://golang.org/s/re2syntax) any place a regex is accepted):
+Unless stated otherwise, the following keywords can be used on all searches (using [RE2 syntax](https://golang.org/s/re2syntax) any place a regex is accepted):

Agree, I think it might be the exception about -content (but that's a bit unclear). The easy way is just to not mention it in the table and add it to the operator section below, also for the stated reason that it is currently guarded by an option.

stefanhengl

comment created time in 17 hours

Pull request review comment sourcegraph/sourcegraph

search: document -content and NOT

 Returns results for files containing matches on the left _and_ right side of the
 Returns file content matching either on the left or right side, or both (set union). The number of results reports the number of matches of both strings.
 
+| Operator | Example |
+| --- | --- |
+| `not`, `NOT` | [`panic not file:main.go lang:go`](https://sourcegraph.com/search?q=panic+not+file:main.go+lang:go&patternType=literal),
+
+`NOT <keyword>:` is equivalent to `-<keyword>:`. `NOT` can only stand before negatable keywords, such as `file`, `content`, `lang`, `repohasfile`, and `repo`.
+

Yeah, the value is in an alternative (arguably) more readable syntax. Note that this is similar to the Lucene syntax. NOT is exactly equivalent, yes. It's just that we are also adding a new capability (and documenting it in this PR) that it is now possible to do:

NOT pattern

However it should be clear that NOT pattern is only guaranteed to work when type:XXX is not specified. Because type:XXX changes the meaning/interpretation of something like pattern type:repo (we will search for repos, not file contents). Thus, if type:XXX is specified, all bets are off about the usage and expectations of NOT pattern.

stefanhengl

comment created time in 17 hours

Pull request review comment sourcegraph/sourcegraph

search: document -content and NOT

 The following keywords can be used on all searches (using [RE2 syntax](https://g
 | **repogroup:group-name** <br> _alias: g_ | Only include results from the named group of repositories (defined by the server admin). Same as using a repo: keyword that matches all of the group's repositories. Use repo: unless you know that the group exists. | |
 | **file:regexp-pattern** <br> _alias: f_ | Only include results in files whose full path matches the regexp. | [`file:\.js$ httptest`](https://sourcegraph.com/search?q=file:%5C.js%24+httptest) <br> [`file:internal/ httptest`](https://sourcegraph.com/search?q=file:internal/+httptest) |
 | **-file:regexp-pattern** <br> _alias: -f_ | Exclude results from files whose full path matches the regexp. | [`file:\.js$ -file:test http`](https://sourcegraph.com/search?q=file:%5C.js%24+-file:test+http) |
-| **content:"pattern"** | Explicitly override the [search pattern](#search-pattern-syntax). Useful for explicitly delineating the pattern to search for if it clashes with other parts of the query. | [`repo:sourcegraph content:"repo:sourcegraph"`](https://sourcegraph.com/search?q=repo:sourcegraph+content:"repo:sourcegraph"&patternType=literal) |
+| **content:"pattern"** | Set the search pattern with a dedicated parameter. Useful when searching literally for a string that may conflict with the [search pattern syntax](#search-pattern-syntax). | [`repo:sourcegraph content:"repo:sourcegraph"`](https://sourcegraph.com/search?q=repo:sourcegraph+content:"repo:sourcegraph"&patternType=literal) |
+| **-content:"pattern"** | Exclude results from files whose content matches the pattern. Note: `-content` is currently only supported for literal and regexp patterns on indexed repositories. | [`file:Dockerfile alpine -content:alpine:latest`](https://sourcegraph.com/search?q=file:Dockerfile+alpine+-content:alpine:latest&patternType=literal) |

Yeah it's best to clone locally to see the actual rendering

stefanhengl

comment created time in 17 hours

PR opened sourcegraph/sourcegraph

Reviewers
search: update integration tests for parser migration

Happy to say that the parser migration passes all of the existing search integration tests. The new parser is stricter about enforcing that type:file is specified with stable:yes, so the relevant tests are updated here. There are also some queries using stable:yes that we don't technically support yet, so I have updated those tests as well.

The assumption from here on out is that these tests work consistently with the new parser, until I/we put @unknwon's testing work into CI.

There is one discrepancy for a test that returns a slightly different suggested query for a particular alert (ordering is different). This is too low value to try and make consistent, but just noting it.

Old:

Screen Shot 2020-08-06 at 8 12 06 PM

New:

Screen Shot 2020-08-06 at 8 12 46 PM

+11 -11

0 comment

1 changed file

pr created time in 17 hours

issue comment sourcegraph/sourcegraph

Surface alert if type: other than file is used with stable:

This is enforced by the new parser ("migrateParser" : true). Will close when migration is complete.

rvantonder

comment created time in 18 hours

issue comment sourcegraph/sourcegraph

Remove or resolve Codemod types in extensions

I was able to reproduce this locally when switching branches during local dev: from a branch where replacer was removed to a branch where it exists (local dev rebuilds the things it thinks it should at runtime). After switching to the branch where it supposedly exists, the frontend shows the Unknown type issue (even with caching disabled):

Screen Shot 2020-08-06 at 5 38 09 PM

But when I do a hard refresh, or paste the URL into a different tab, it goes away, so I'm chalking it up to Felix's explanation.

rvantonder

comment created time in 19 hours

PR opened sourcegraph/sourcegraph

remove replacer service (take 2)

<!-- Reminder: Have you updated the changelog and relevant docs (user docs, architecture diagram, etc) ? -->

+8 -1523

0 comment

45 changed files

pr created time in 20 hours

delete branch sourcegraph/sourcegraph

delete branch : rvt/replacer-again

delete time in 20 hours

PR closed sourcegraph/sourcegraph

remove replacer service (take 2)

<!-- Reminder: Have you updated the changelog and relevant docs (user docs, architecture diagram, etc) ? -->

+8268 -5092

3 comments

162 changed files

rvantonder

pr closed time in 20 hours

create branch sourcegraph/sourcegraph

branch : rvt/replacer-agane

created branch time in 20 hours

PR opened sourcegraph/sourcegraph

remove replacer service (take 2)

<!-- Reminder: Have you updated the changelog and relevant docs (user docs, architecture diagram, etc) ? -->

+8268 -5092

0 comment

162 changed files

pr created time in 20 hours

create branch sourcegraph/sourcegraph

branch : rvt/replacer-again

created branch time in 20 hours

issue comment sourcegraph/sourcegraph

Type error in search code on `frontend`

(and thanks for flagging)

beyang

comment created time in a day

issue closed sourcegraph/sourcegraph

Type error in search code on `frontend`

Just saw this stack trace on one of the frontend pods, which died:

t=2020-08-06T21:33:47+0000 lvl=dbug msg="TRACE HTTP" method=POST url=/.api/graphql?DefinitionAndHover routename="graphql: DefinitionAndHover" trace="#tracing_not_enabled_for_this_request_add_?trace=1_to_url_to_enable" userAgent="Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_5) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/84.0.4147.105 Safari/537.36" user=10218 xForwardedFor=108.162.219.47 written=57 code=200 duration=28.796113ms graphql_error=false
t=2020-08-06T21:33:47+0000 lvl=dbug msg="TRACE HTTP" method=HEAD url=/github.com/golang/text@HEAD/-/raw routename=raw trace="#tracing_not_enabled_for_this_request_add_?trace=1_to_url_to_enable" userAgent=Go-http-client/2.0 user=0 xForwardedFor=162.158.78.167 written=0 code=200 duration=7.158321ms graphql_error=false
t=2020-08-06T21:33:47+0000 lvl=info msg="serving GraphQL request" name=unknown user=danzh2010 source=browser
logging complete query for unnamed GraphQL request above name=unknown user=danzh2010 source=browser:
QUERY
-----

            
        fragment SearchResults on Search {
            results {
                __typename
                results {
                    ... on FileMatch {
                        __typename
                        file {
                            path
                            commit {
                                oid
                            }
                        }
                        repository {
                            name
                        }
                        symbols {
                            name
                            kind
                            location {
                                resource {
                                    path
                                }
                                range {
                                    start {
                                        line
                                        character
                                    }
                                    end {
                                        line
                                        character
                                    }
                                }
                            }
                        }
                        lineMatches {
                            lineNumber
                            offsetAndLengths
                        }
                    }
                }
            }
        }
    
            
        fragment FileLocal on Search {
            results {
                __typename
                results {
                    ... on FileMatch {
                        symbols {
                            fileLocal
                        }
                    }
                }
            }
        }
    
            query CodeIntelSearch($query: String!) {
                search(query: $query) {
                    ...SearchResults
                    ...FileLocal
                }
            }
        

VARIABLES
---------
map[query:^factory$ type:symbol patternType:regexp case:yes file:\.(c|cc|cpp|cxx|hh|h|hpp|ino|m)$ repo:^github.com/envoyproxy/envoy$@432ee807210907d769c10de7af2e775d23502f36]

t=2020-08-06T21:33:47+0000 lvl=dbug msg="TRACE HTTP" method=POST url=/.api/updates routename=updatecheck trace="#tracing_not_enabled_for_this_request_add_?trace=1_to_url_to_enable" userAgent=Go-http-client/2.0 user=0 xForwardedFor=172.69.35.74 written=0 code=204 duration=37.295058ms graphql_error=false
t=2020-08-06T21:33:47+0000 lvl=dbug msg="TRACE HTTP" method=POST url=/.api/graphql routename="graphql: unknown" trace="#tracing_not_enabled_for_this_request_add_?trace=1_to_url_to_enable" userAgent="Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_5) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/84.0.4147.105 Safari/537.36" user=10218 xForwardedFor=108.162.219.47 written=597 code=200 duration=112.725392ms graphql_error=false
t=2020-08-06T21:33:48+0000 lvl=dbug msg="serving GraphQL request" name=Extensions user= source=browser
panic: interface conversion: query.QueryInfo is *query.OrdinaryQuery, not *query.AndOrQuery

goroutine 783743 [running]:
github.com/sourcegraph/sourcegraph/cmd/frontend/graphqlbackend.(*searchResolver).getExactFilePatterns(0xc032dda320, 0xc04f985580)
	github.com/sourcegraph/sourcegraph/cmd/frontend/graphqlbackend/search_results.go:1928 +0x193
github.com/sourcegraph/sourcegraph/cmd/frontend/graphqlbackend.(*searchResolver).sortResults(0xc032dda320, 0x2c8b520, 0xc04f985580, 0x0, 0x0, 0x0)
	github.com/sourcegraph/sourcegraph/cmd/frontend/graphqlbackend/search_results.go:1919 +0x58
github.com/sourcegraph/sourcegraph/cmd/frontend/graphqlbackend.(*searchResolver).doResults(0xc032dda320, 0x2c8b5a0, 0xc008744060, 0x1b7d228, 0x4, 0x0, 0x0, 0x0)
	github.com/sourcegraph/sourcegraph/cmd/frontend/graphqlbackend/search_results.go:1844 +0xf26
github.com/sourcegraph/sourcegraph/cmd/frontend/graphqlbackend.(*searchResolver).Suggestions.func5(0x2c8b5a0, 0xc008744060, 0x0, 0x0, 0x0, 0x0, 0x0)
	github.com/sourcegraph/sourcegraph/cmd/frontend/graphqlbackend/search_suggestions.go:276 +0x13a
github.com/sourcegraph/sourcegraph/cmd/frontend/graphqlbackend.(*searchResolver).Suggestions.func6(0xc04f985540, 0x2c8b5e0, 0xc024f21710, 0xc004612db0, 0xc05f275380, 0xc032dda320, 0xc05f275340)
	github.com/sourcegraph/sourcegraph/cmd/frontend/graphqlbackend/search_suggestions.go:311 +0xc5
created by github.com/sourcegraph/sourcegraph/cmd/frontend/graphqlbackend.(*searchResolver).Suggestions
	github.com/sourcegraph/sourcegraph/cmd/frontend/graphqlbackend/search_suggestions.go:307 +0x572

@rvantonder is this up your alley? Seems like it's associated with the user query parsing code.

closed time in a day

beyang

issue comment sourcegraph/sourcegraph

Type error in search code on `frontend`

@beyang fix should be live after I merged https://github.com/sourcegraph/deploy-sourcegraph-dot-com/pull/3151. If not, bounce pods?

beyang

comment created time in a day

issue comment sourcegraph/sourcegraph

search.largeFiles not working

Since this shares similar traits with the issue in #12347 (which is: large text files on the unindexed search path), it may have the same root cause: syntect, the syntax highlighting service, can't deal with these files. In comment https://github.com/sourcegraph/sourcegraph/issues/12347#issuecomment-669711123 I showed some of the logs that pop up on the frontend when this happens.

Knowing whether their instance reports something like this in the frontend:

frontend | ERROR syntax highlighting failed (this is a bug, please report it), filepath: <STUFF> error: http://localhost:9238: HSS worker timeout while serving request

(i.e., search for "HSS worker timeout"), or anything reported by the syntect server like:

 syntect_server | 2020/08/06 05:24:48 worker 77: restarting due to timeout

would help confirm or eliminate this as the cause.

In general, grabbing some logs (the frontend's are probably the most useful) when searching for this file would be helpful.

dadlerj

comment created time in a day

push event sourcegraph/sourcegraph

Rijnard van Tonder

commit sha c97612c821b3cd7b7c2a5bbd9debb3974b387ce4

search: fix panic calling getExactFilePatterns for unguarded glob flag (#12807)


push time in a day

delete branch sourcegraph/sourcegraph

delete branch : rvt/fix-suggestion-panic

delete time in a day

PR merged sourcegraph/sourcegraph

search: fix panic calling getExactFilePatterns for unguarded glob flag

The getExactFilePatterns function should be guarded by the getBoolPtr(settings.SearchGlobbing, false) flag (not an `or` clause). Prior to this fix, the suggestions code could trigger a type assertion error on this path.
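For context, a rough, self-contained sketch of the guard shape described above (the stubs and the exact before/after condition are assumptions, not the literal one-line diff):

```go
package main

import "fmt"

// The names (decodedViewerFinalSettings, SearchGlobbing, getExactFilePatterns)
// mirror snippets quoted elsewhere in this thread; the stubs here are hypothetical.

type settings struct{ SearchGlobbing *bool }

// decodedViewerFinalSettings is a stub that simulates settings failing to load.
func decodedViewerFinalSettings() (*settings, error) {
	return nil, fmt.Errorf("settings unavailable")
}

func getBoolPtr(b *bool, def bool) bool {
	if b == nil {
		return def
	}
	return *b
}

// getExactFilePatterns stands in for the code path that performs the
// *query.AndOrQuery type assertion and panics when reached for ordinary queries.
func getExactFilePatterns() map[string]struct{} {
	panic("would type-assert *query.AndOrQuery here")
}

func main() {
	var exactPatterns map[string]struct{}

	// Buggy shape: `err != nil ||` takes the branch whenever settings fail to
	// load, so getExactFilePatterns can run (and panic) even with globbing off:
	//
	//   if s, err := decodedViewerFinalSettings(); err != nil || getBoolPtr(s.SearchGlobbing, false) {
	//       exactPatterns = getExactFilePatterns()
	//   }

	// Guarded shape: only compute exact file patterns when settings loaded
	// successfully AND the globbing flag is actually on.
	if s, err := decodedViewerFinalSettings(); err == nil && getBoolPtr(s.SearchGlobbing, false) {
		exactPatterns = getExactFilePatterns()
	}
	fmt.Println("exactPatterns:", exactPatterns)
}
```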

+1 -1

0 comment

1 changed file

rvantonder

pr closed time in a day

issue comment sourcegraph/sourcegraph

Type error in search code on `frontend`

@beyang https://github.com/sourcegraph/sourcegraph/pull/12807

beyang

comment created time in a day

PR opened sourcegraph/sourcegraph

search: fix panic calling getExactFilePatterns for unguarded glob flag

The getExactFilePatterns function should be guarded by the getBoolPtr(settings.SearchGlobbing, false) flag (not an `or` clause). Prior to this fix, the suggestions code could trigger a type assertion error on this path.

+1 -1

0 comment

1 changed file

pr created time in a day

create branch sourcegraph/sourcegraph

branch : rvt/fix-suggestion-panic

created branch time in a day

issue comment sourcegraph/sourcegraph

Type error in search code on `frontend`

panic: interface conversion: query.QueryInfo is *query.OrdinaryQuery, not *query.AndOrQuery

Yeah, this is something I can look into. Looks like an incorrect type assertion added recently.

beyang

comment created time in a day

Pull request review comment sourcegraph/sourcegraph

search: document -content and NOT

 All notable changes to Sourcegraph are documented in this file.
 - Saved search emails now include a link to the user's saved searches page. [#11651](https://github.com/sourcegraph/sourcegraph/pull/11651)
 - Campaigns can now be synced using GitLab webhooks. [#12139](https://github.com/sourcegraph/sourcegraph/pull/12139)
 - Configured `observability.alerts` can now be tested using a GraphQL endpoint, `triggerObservabilityTestAlert`. [#12532](https://github.com/sourcegraph/sourcegraph/pull/12532)
+- `content:` filters can now be negated (`-content:`) for literal and regular expression patterns on indexed repositories.
+- `NOT`, a new query operator, is now available as an alternative to `-` on supported keywords.

Also, this only works on indexed repos (worth stating in the change log and operator section)

stefanhengl

comment created time in a day

Pull request review comment sourcegraph/sourcegraph

search: document -content and NOT

 Returns results for files containing matches on the left _and_ right side of the
 Returns file content matching either on the left or right side, or both (set union). The number of results reports the number of matches of both strings.
 
+| Operator | Example |
+| --- | --- |
+| `not`, `NOT` | [`panic not file:main.go lang:go`](https://sourcegraph.com/search?q=panic+not+file:main.go+lang:go&patternType=literal),
+
+`NOT <keyword>:` is equivalent to `-<keyword>:`. `NOT` can only stand before negatable keywords, such as `file`, `content`, `lang`, `repohasfile`, and `repo`.
+

Aside: `not 'something' type:file` probably doesn't work right now. `type:XXX` is a bit of a pain, since sometimes it is an alias (e.g., that would be `-file:something`), but sometimes it doesn't act exactly like an alias, and in general I don't feel confident about what it does (especially not on sourcegraph.com's default repos). So: we could write a transformer to support the previous case, but to be safe, I'm in favor of adding a validation check that `not` is not used in conjunction with any `type:XXX` term.
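Roughly the kind of check I have in mind, sketched below (the node types and helper are hypothetical stand-ins, not the real query package):

```go
package main

import (
	"fmt"
	"strings"
)

// Rough sketch: reject queries that combine a negated pattern (`not ...`)
// with any `type:XXX` parameter. Hypothetical node types, not the real parser.

type node interface{}

type parameter struct { // e.g. type:repo, file:main.go
	field, value string
	negated      bool
}

type pattern struct { // e.g. `not something`
	value   string
	negated bool
}

type operator struct { // and/or with children
	kind     string
	operands []node
}

func validateNotWithType(nodes []node) error {
	var hasType, hasNegatedPattern bool
	var walk func(n node)
	walk = func(n node) {
		switch v := n.(type) {
		case parameter:
			if strings.EqualFold(v.field, "type") {
				hasType = true
			}
		case pattern:
			if v.negated {
				hasNegatedPattern = true
			}
		case operator:
			for _, o := range v.operands {
				walk(o)
			}
		}
	}
	for _, n := range nodes {
		walk(n)
	}
	if hasType && hasNegatedPattern {
		return fmt.Errorf("query validation: `not <pattern>` is not supported together with `type:`")
	}
	return nil
}

func main() {
	q := []node{operator{kind: "and", operands: []node{
		pattern{value: "something", negated: true},
		parameter{field: "type", value: "file"},
	}}}
	fmt.Println(validateNotWithType(q))
}
```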

stefanhengl

comment created time in a day

Pull request review comment sourcegraph/sourcegraph

search: document -content and NOT

 The following keywords can be used on all searches (using [RE2 syntax](https://g
 | **repogroup:group-name** <br> _alias: g_ | Only include results from the named group of repositories (defined by the server admin). Same as using a repo: keyword that matches all of the group's repositories. Use repo: unless you know that the group exists. | |
 | **file:regexp-pattern** <br> _alias: f_ | Only include results in files whose full path matches the regexp. | [`file:\.js$ httptest`](https://sourcegraph.com/search?q=file:%5C.js%24+httptest) <br> [`file:internal/ httptest`](https://sourcegraph.com/search?q=file:internal/+httptest) |
 | **-file:regexp-pattern** <br> _alias: -f_ | Exclude results from files whose full path matches the regexp. | [`file:\.js$ -file:test http`](https://sourcegraph.com/search?q=file:%5C.js%24+-file:test+http) |
-| **content:"pattern"** | Explicitly override the [search pattern](#search-pattern-syntax). Useful for explicitly delineating the pattern to search for if it clashes with other parts of the query. | [`repo:sourcegraph content:"repo:sourcegraph"`](https://sourcegraph.com/search?q=repo:sourcegraph+content:"repo:sourcegraph"&patternType=literal) |
+| **content:"pattern"** | Set the search pattern with a dedicated parameter. Useful when searching literally for a string that may conflict with the [search pattern syntax](#search-pattern-syntax). | [`repo:sourcegraph content:"repo:sourcegraph"`](https://sourcegraph.com/search?q=repo:sourcegraph+content:"repo:sourcegraph"&patternType=literal) |
+| **-content:"pattern"** | Exclude results from files whose content matches the pattern. Note: `-content` is currently only supported for literal and regexp patterns on indexed repositories. | [`file:Dockerfile alpine -content:alpine:latest`](https://sourcegraph.com/search?q=file:Dockerfile+alpine+-content:alpine:latest&patternType=literal) |

Because this is particular to the migrateParser option that we're not shipping yet (i.e., this entry in the table has a constraint), I would, for now, not include it in this table, and restrict discussion of it to the operators section below.

stefanhengl

comment created time in a day

Pull request review comment sourcegraph/sourcegraph

search: document -content and NOT

 All notable changes to Sourcegraph are documented in this file.
 - Saved search emails now include a link to the user's saved searches page. [#11651](https://github.com/sourcegraph/sourcegraph/pull/11651)
 - Campaigns can now be synced using GitLab webhooks. [#12139](https://github.com/sourcegraph/sourcegraph/pull/12139)
 - Configured `observability.alerts` can now be tested using a GraphQL endpoint, `triggerObservabilityTestAlert`. [#12532](https://github.com/sourcegraph/sourcegraph/pull/12532)
+- `content:` filters can now be negated (`-content:`) for literal and regular expression patterns on indexed repositories.
+- `NOT`, a new query operator, is now available as an alternative to `-` on supported keywords.
- It is now possible to search for file content that excludes a term using the `NOT` operator. [TODO Stefan references a PR](some PR)
- `NOT`, a new query operator, is now available as an alternative to `-` on supported keywords `repo`, `file`, `content`, `lang`, `repohasfile`. [TODO Stefan references a PR](some PR)

To be totally accurate, we should add for both of these entries:

when the experimental migrateParser: true option is set in site settings

(or similar).

I will remove this clause if I feel confident about turning it on for 3.19. Similar for the operator doc below.

stefanhengl

comment created time in a day

issue comment sourcegraph/sourcegraph

Search recommendations to add quotes is misleading.

Got it! Thanks for clarifying. That's probably a relatively quick fix (let's just not show things that aren't helpful in this context).

dadlerj

comment created time in a day

PR closed sourcegraph/sourcegraph

Improve AND operand performance

This PR changes the way the AND operator is evaluated by reducing the set of files to search every time we process an operand. For each operand, every result is stored and used to indicate to the search engines which files to look into. Because search engines don't support repo+file search filters (AFAIK), but only file filters, it is important to intersect every operand's results with the previous results to avoid false positives. Also, I tried to find a way to avoid running the query multiple times if it doesn't return enough results, but I couldn't find a simple enough solution, so I kept the existing behavior.

Fixes #11281
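To make the approach concrete, a minimal sketch of the intersection idea (plain string sets stand in for real result types; this is not the PR's actual code):

```go
package main

import "fmt"

// Evaluate AND operands one at a time, and use the files matched so far to
// narrow what the next operand may return; intersect after each step so later
// operands only report files every earlier operand also matched.

type fileSet map[string]struct{}

// searchFn stands in for running one operand, optionally restricted to a set of files.
type searchFn func(restrictTo fileSet) fileSet

func intersect(a, b fileSet) fileSet {
	out := fileSet{}
	for f := range a {
		if _, ok := b[f]; ok {
			out[f] = struct{}{}
		}
	}
	return out
}

func evalAnd(operands []searchFn) fileSet {
	var acc fileSet
	for i, run := range operands {
		results := run(acc)
		if i == 0 {
			acc = results
			continue
		}
		acc = intersect(acc, results)
		if len(acc) == 0 {
			break // nothing can satisfy the remaining operands
		}
	}
	return acc
}

func main() {
	foo := func(fileSet) fileSet { return fileSet{"repo/a.go": {}, "repo/b.go": {}} }
	bar := func(fileSet) fileSet { return fileSet{"repo/b.go": {}, "repo/c.go": {}} }
	fmt.Println(evalAnd([]searchFn{foo, bar})) // map[repo/b.go:{}]
}
```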

+139 -17

18 comments

5 changed files

asdine

pr closed time in a day

pull request comment sourcegraph/sourcegraph

Improve AND operand performance

Closing this PR. @stefanhengl will work on this functionality whenever we're ready, but won't rebase off of this; he'll just use it for inspiration.

asdine

comment created time in a day

Pull request review comment sourcegraph/sourcegraph

search: document -content and NOT

 Returns results for files containing matches on the left _and_ right side of the
 Returns file content matching either on the left or right side, or both (set union). The number of results reports the number of matches of both strings.
 
+| Operator | Example |
+| --- | --- |
+| `not`, `NOT` | [`panic not file:main.go lang:go`](https://sourcegraph.com/search?q=panic+not+file:main.go+lang:go&patternType=literal),
+
+`NOT <keyword>:` is equivalent to `-<keyword>:`. `NOT` can only stand before negatable keywords, such as `file`, `content`, `lang`, `repohasfile`, and `repo`.
+

We should add that it is possible to search for the negation of a pattern (where `content:` isn't required). So, two examples like:

`panic not never`

`panic and not never`

I think we should mention both, since `panic not never` reads terse, which is nice, but also not super obvious unless you're a power user. `panic and not never` reads like English, and is equivalent, so is nice to mention.
stefanhengl

comment created time in a day

issue comment sourcegraph/sourcegraph

Diff search unreliable: times out fetching large repos. Search recommendations to add quotes is misleading.

@keegancsmith do we run diff searches on gitserver? I had a brief look at the code but not sure. Wondering whether there's fetching/eviction behavior for repos on the service that runs diff/commit searches that might affect this.

dadlerj

comment created time in a day

issue comment sourcegraph/sourcegraph

Timeout search recommendations in literal mode add quotes to searches

@dadlerj this is a problem with the repo fetching/caching :'( If you now try either the original search or the one with quotes, it won't time out and will show results. If it does time out, wait about a minute and search again, and you'll see the expected results. We have existing issues related to this but I'm not sure how intertwined they are with diff search. Will dig them up and cross-reference.

dadlerj

comment created time in a day

issue comment sourcegraph/sourcegraph

Timeout search recommendations in literal mode add quotes to searches

assigning to myself to investigate

dadlerj

comment created time in a day

Pull request review comment sourcegraph/sourcegraph

search: document -content and NOT

 Only search files in the specified programming language, like `typescript` or
     <tbody>
       <tr class="r">
         <td class="d"></td>
+        <td class="d">
+          <table class="r">
+            <tbody>
+              <tr class="r">
+                <td class="ts"></td>
+                <td class="d">&nbsp;</td>
+                <td class="te"></td>
+              </tr>
+              <tr class="r">
+                <td class="ks"></td>
+                <td class="d"><code class="c">–</code></td>
+                <td class="ke"></td>
+              </tr>
+              <tr class="r">
+                <td class="ls"></td>
+                <td class="d"><code class="c">NOT</code></td>
+                <td class="le"></td>
+              </tr>
+            </tbody>
+          </table>
+        </td>

At some point I'll write a little utility to generate these (sorry about that) :-) Don't think we have any upcoming changes so thanks for doing this manually for now.
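Something in the spirit of the generator I mean, sketched with text/template (the row and class names are copied from the fragment above; the data model is made up):

```go
package main

import (
	"os"
	"text/template"
)

// Hypothetical sketch of a generator for the operator-alias table rows above,
// so the nested <tr>/<td> markup doesn't have to be written by hand.

type alias struct{ Start, End, Code string }

var rowTmpl = template.Must(template.New("rows").Parse(`              <tr class="r">
                <td class="{{.Start}}"></td>
                <td class="d">{{if .Code}}<code class="c">{{.Code}}</code>{{else}}&nbsp;{{end}}</td>
                <td class="{{.End}}"></td>
              </tr>
`))

func main() {
	rows := []alias{
		{"ts", "te", ""},    // spacer row
		{"ks", "ke", "–"},   // `-` prefix form
		{"ls", "le", "NOT"}, // keyword form
	}
	for _, r := range rows {
		if err := rowTmpl.Execute(os.Stdout, r); err != nil {
			panic(err)
		}
	}
}
```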

stefanhengl

comment created time in a day

push event sourcegraph/sourcegraph

Rijnard van Tonder

commit sha f41a2adca8927c1cf6be8f23026d4abf96d2281f

search: perform same trailing paren heuristic as old parser in new parser (#12774)


push time in a day

delete branch sourcegraph/sourcegraph

delete branch : rvt/fix-regex-paren-heuristic

delete time in a day

PR merged sourcegraph/sourcegraph

Perform same trailing paren heuristic as old parser in new parser

Fixes #12733 (see description there). Apparently I left a TODO for myself about this.

We can inline the heuristic logic in the parser, and that would be more efficient. But as a matter of organizing this code, I prefer to apply any heuristics as passes after parsing where possible, even though that means doing another pass on the query. So, I've implemented this as an additional transformer function.
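As a rough illustration of the "heuristic as a separate pass" shape (the string-based escaping below is a simplification; the real transformer operates on the parse tree):

```go
package main

import (
	"fmt"
	"strings"
)

// Apply a heuristic as a transformer pass after parsing rather than inlining
// it in the parser. Here the "tree" is just a list of pattern strings; the
// heuristic escapes a dangling trailing '(' so the pattern stays a valid
// regexp, mirroring the old parser's autofix behavior.

type pass func(patterns []string) []string

func escapeDanglingTrailingParen(patterns []string) []string {
	out := make([]string, len(patterns))
	for i, p := range patterns {
		if strings.HasSuffix(p, "(") && !strings.HasSuffix(p, `\(`) {
			p = p[:len(p)-1] + `\(`
		}
		out[i] = p
	}
	return out
}

// applyPasses runs each transformer over the parsed query in order. Adding a
// heuristic means appending a pass, at the cost of one more traversal.
func applyPasses(patterns []string, passes ...pass) []string {
	for _, f := range passes {
		patterns = f(patterns)
	}
	return patterns
}

func main() {
	parsed := []string{"foo.bar("} // a regexp pattern with an unescaped trailing paren
	fmt.Println(applyPasses(parsed, escapeDanglingTrailingParen)) // [foo.bar\(]
}
```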

+29 -7

0 comment

3 changed files

rvantonder

pr closed time in a day

issue closed sourcegraph/sourcegraph

Treat trailing parentheses in new parser similar to old parser

We have this autofix logic in the old parser that will escape trailing parens in a regex string. This isn't done in the new parser--the new parser instead requires the trailing ( to be escaped. When it is not escaped, it treats the input literally (bug).

Old:

Screen Shot 2020-08-04 at 9 35 12 PM

New (bug):

Screen Shot 2020-08-04 at 9 35 48 PM

Screen Shot 2020-08-04 at 9 35 32 PM

closed time in a day

rvantonder

Pull request review comment sourcegraph/sourcegraph

search: facilitate simple searches if globbing is active

 func reporevToRegex(value string) (string, error) {
 	return value, nil
 }
 
+var globSymbols = lazyregexp.New(`[][*?/]`)
+
+func ContainsNoGlobSymbols(value string) bool {

Nit: is "syntax" better than "symbols", potentially? Or is "symbols" preferred in the context of the globbing language?

stefanhengl

comment created time in 2 days

Pull request review comment sourcegraph/sourcegraph

search: facilitate simple searches if globbing is active

 const (
 	FieldContent            = "content"
 	FieldVisibility         = "visibility"
 
+	FieldFileOffset = len(FieldFile) + 1

Nit: we want to depend as little as possible on searchquery.go, or add new things to it. I think it's quite fine to just inline len(FieldFile) + 1 in the single place this is used.

stefanhengl

comment created time in 2 days

Pull request review comment sourcegraph/sourcegraph

search: facilitate simple searches if globbing is active

 func isContextError(ctx context.Context, err error) bool {
 //
 // Note: Any new result types added here also need to be handled properly in search_results.go:301 (sparklines)
 type SearchResultResolver interface {
+	searchResultURIGetter
+
 	ToRepository() (*RepositoryResolver, bool)
 	ToFileMatch() (*FileMatchResolver, bool)
 	ToCommitSearchResult() (*commitSearchResultResolver, bool)
 	ToCodemodResult() (*codemodResultResolver, bool)
 
-	// SearchResultURIs returns the repo name and file uri respectiveley
-	searchResultURIs() (string, string)
 	resultCount() int32
 }
 
+type searchResultURIGetter interface {
+	// SearchResultURIs returns the repo name and file uri respectively
+	searchResultURIs() (string, string)
+}

I'm probably missing something: why is it necessary to wrap searchResultURIs in this interface?

stefanhengl

comment created time in 2 days

Pull request review comment sourcegraph/sourcegraph

search: facilitate simple searches if globbing is active

 func compareSearchResults(a, b SearchResultResolver) bool {
 	if arepo == brepo {
 		return afile < bfile
 	}
-
 	return arepo < brepo
 }
 
 func sortResults(r []SearchResultResolver) {
 	sort.Slice(r, func(i, j int) bool { return compareSearchResults(r[i], r[j]) })
 }
 
+// compareSearchResultsAndOr is like compareSearchResults, but overrides sorting in alphabetical order if
+// one of the filenames is contained in exactFilePatterns, in which case exact matches are sorted by
+// length of their file path and then alphabetically.
+func compareSearchResultsAndOr(a, b searchResultURIGetter, exactFilePatterns map[string]struct{}) bool {
+	arepo, afile := a.searchResultURIs()
+	brepo, bfile := b.searchResultURIs()
+
+	if arepo == brepo {
+		if exactFilePatterns == nil || len(exactFilePatterns) == 0 {
+			return afile < bfile
+		}
+		_, aMatch := exactFilePatterns[filepath.Base(afile)]
+		_, bMatch := exactFilePatterns[filepath.Base(bfile)]
+		if aMatch || bMatch {
+			if aMatch && bMatch {
+				if len(afile) < len(bfile) {
+					return true
+				}
+				if len(bfile) < len(afile) {
+					return false
+				}
+				return afile < bfile
+			}
+			if aMatch {
+				return true
+			}
+			return false
+		}
+		return afile < bfile
+	}
+	return arepo < brepo
+}
+
+// getExactFilePatterns returns the set of file patterns without glob syntax.
+func (r *searchResolver) getExactFilePatterns() map[string]struct{} {
+	m := map[string]struct{}{}
+	query.VisitField(
+		r.query.(*query.AndOrQuery).Query,
+		query.FieldFile,
+		func(value string, negated bool, annotation query.Annotation) {
+			originalValue := r.originalQuery[annotation.Range.Start.Column+query.FieldFileOffset : annotation.Range.End.Column]
+			if !negated && query.ContainsNoGlobSymbols(originalValue) {
+				m[originalValue] = struct{}{}
+			}
+		})
+	return m
+}
+
+// sortResultsAndOr is like sortResults, but fine tunes the comparison function by taking into account:
+// - the original query
+// - whether globbing is active
+//
+// If globbing is active and the original query contains exact file patterns (no glob syntax), results with
+// exact matches will appear first within their repository.
+func (r *searchResolver) sortResultsAndOr(ctx context.Context, rr []SearchResultResolver) {
+	var exactPatterns map[string]struct{}
+	if settings, err := decodedViewerFinalSettings(ctx); err != nil || getBoolPtr(settings.SearchGlobbing, false) {
+		exactPatterns = r.getExactFilePatterns()
+	}
+	sort.Slice(rr, func(i, j int) bool { return compareSearchResultsAndOr(rr[i], rr[j], exactPatterns) })
+}

I think it's fine to guard this behind the globbing option. If we decide not to use globbing, we can apply the same sorting idea to our current regular expressions right? Just confirming that if that happened, it would be straightforward to activate this functionality by removing the SearchGlobbing flag here (and assuming we implement a similar getExactFilePatterns for a mode where regex is enabled).

So basically, it's worth separating the parts where we rely on globbing to activate the sorting from the part that actually sorts/orders the results (compareSearchResultsAndOr). It seems like the code is laid out for that already.

stefanhengl

comment created time in 2 days

Pull request review comment sourcegraph/sourcegraph

search: facilitate simple searches if globbing is active

 func compareSearchResults(a, b SearchResultResolver) bool {
 	if arepo == brepo {
 		return afile < bfile
 	}
-
 	return arepo < brepo
 }
 
 func sortResults(r []SearchResultResolver) {
 	sort.Slice(r, func(i, j int) bool { return compareSearchResults(r[i], r[j]) })
 }
 
+// compareSearchResultsAndOr is like compareSearchResults, but overrides sorting in alphabetical order if
+// one of the filenames is contained in exactFilePatterns, in which case exact matches are sorted by
+// length of their file path and then alphabetically.
+func compareSearchResultsAndOr(a, b searchResultURIGetter, exactFilePatterns map[string]struct{}) bool {
+	arepo, afile := a.searchResultURIs()
+	brepo, bfile := b.searchResultURIs()
+
+	if arepo == brepo {
+		if exactFilePatterns == nil || len(exactFilePatterns) == 0 {
+			return afile < bfile
+		}
+		_, aMatch := exactFilePatterns[filepath.Base(afile)]
+		_, bMatch := exactFilePatterns[filepath.Base(bfile)]
+		if aMatch || bMatch {
+			if aMatch && bMatch {
+				if len(afile) < len(bfile) {
+					return true
+				}
+				if len(bfile) < len(afile) {
+					return false
+				}
+				return afile < bfile
+			}
+			if aMatch {
+				return true
+			}
+			return false
+		}
+		return afile < bfile
+	}
+	return arepo < brepo
+}
+
+// getExactFilePatterns returns the set of file patterns without glob syntax.
+func (r *searchResolver) getExactFilePatterns() map[string]struct{} {
+	m := map[string]struct{}{}
+	query.VisitField(
+		r.query.(*query.AndOrQuery).Query,
+		query.FieldFile,
+		func(value string, negated bool, annotation query.Annotation) {
+			originalValue := r.originalQuery[annotation.Range.Start.Column+query.FieldFileOffset : annotation.Range.End.Column]
+			if !negated && query.ContainsNoGlobSymbols(originalValue) {
+				m[originalValue] = struct{}{}
+			}
+		})
+	return m
+}
+
+// sortResultsAndOr is like sortResults, but fine tunes the comparison function by taking into account:
+// - the original query
+// - whether globbing is active
+//
+// If globbing is active and the original query contains exact file patterns (no glob syntax), results with
+// exact matches will appear first within their repository.

Also thinking we can remove the AndOr parts in these function names; it feels kinda meaningless (in theory it's not specific to AndOr queries, right? We could have done it with existing queries). Maybe something like sortResultsExactMatchesOnTop, but less verbose; can't think of something better right now.

stefanhengl

comment created time in 2 days

Pull request review comment sourcegraph/sourcegraph

search: facilitate simple searches if globbing is active

 func compareSearchResults(a, b SearchResultResolver) bool {
 	if arepo == brepo {
 		return afile < bfile
 	}
-
 	return arepo < brepo
 }
 
 func sortResults(r []SearchResultResolver) {
 	sort.Slice(r, func(i, j int) bool { return compareSearchResults(r[i], r[j]) })
 }
 
+// compareSearchResultsAndOr is like compareSearchResults, but overrides sorting in alphabetical order if
+// one of the filenames is contained in exactFilePatterns, in which case exact matches are sorted by
+// length of their file path and then alphabetically.
+func compareSearchResultsAndOr(a, b searchResultURIGetter, exactFilePatterns map[string]struct{}) bool {
+	arepo, afile := a.searchResultURIs()
+	brepo, bfile := b.searchResultURIs()
+
+	if arepo == brepo {
+		if exactFilePatterns == nil || len(exactFilePatterns) == 0 {
+			return afile < bfile
+		}
+		_, aMatch := exactFilePatterns[filepath.Base(afile)]
+		_, bMatch := exactFilePatterns[filepath.Base(bfile)]
+		if aMatch || bMatch {
+			if aMatch && bMatch {
+				if len(afile) < len(bfile) {
+					return true
+				}
+				if len(bfile) < len(afile) {
+					return false
+				}
+				return afile < bfile
+			}
+			if aMatch {
+				return true
+			}
+			return false
+		}
+		return afile < bfile
+	}
+	return arepo < brepo
+}
+
+// getExactFilePatterns returns the set of file patterns without glob syntax.
+func (r *searchResolver) getExactFilePatterns() map[string]struct{} {
+	m := map[string]struct{}{}
+	query.VisitField(
+		r.query.(*query.AndOrQuery).Query,
+		query.FieldFile,
+		func(value string, negated bool, annotation query.Annotation) {
+			originalValue := r.originalQuery[annotation.Range.Start.Column+query.FieldFileOffset : annotation.Range.End.Column]
+			if !negated && query.ContainsNoGlobSymbols(originalValue) {
+				m[originalValue] = struct{}{}
+			}
+		})
+	return m
+}
+
+// sortResultsAndOr is like sortResults, but fine tunes the comparison function by taking into account:
+// - the original query
+// - whether globbing is active
+//
+// If globbing is active and the original query contains exact file patterns (no glob syntax), results with
+// exact matches will appear first within their repository.

"fine tunes" is a bit vague. Maybe a description like

sortResultsAndOr partitions a result set into matches that correspond to an exact file path match (ordered first in the resulting set) and matches that correspond to substring matches (ordered after exact matches).

I don't think I captured what the function does accurately in my suggestion; I just think words like "partition" and "order" probably convey what's going on here a bit better.

stefanhengl

comment created time in 2 days

issue comment sourcegraph/sourcegraph

Search 504s

OK, so after cloning locally and running a search, it seems these files are actually wreaking havoc on syntect server (not searcher).

<details> <summary>Expand for example logs</summary>

<pre> frontend | WARN syntax highlighting took longer than 3s, this could indicate a bug in Sourcegraph, filepath: api_response_dump/lang-go__star-001442__ppg-100__pg-02.json, repo_name: github.com/sourcegraph/ghdump, revision: 0b04aa0167b403f5fc7ed43ecf15a34e76eb1978, snippet: "{"total_count":1130,"incomplete_results":false,"items":[{"id":11162159,"node_id""… 22:24:48 frontend | WARN syntax highlighting took longer than 3s, this could indicate a bug in Sourcegraph, filepath: api_response_dump/lang-go__star-000279__ppg-100__pg-02.json, repo_name: github.com/sourcegraph/ghdump, revision: 0b04aa0167b403f5fc7ed43ecf15a34e76eb1978, snippet: "{"total_count":4119,"incomplete_results":false,"items":[{"id":13910780,"node_id""… 22:24:48 syntect_server | 2020/08/06 05:24:48 worker 120: restarting due to timeout 22:24:48 syntect_server | 2020/08/06 05:24:48 worker 106: restarting due to timeout 22:24:48 frontend | WARN syntax highlighting took longer than 3s, this could indicate a bug in Sourcegraph, filepath: api_response_dump/lang-c++__star-000141__ppg-100__pg-05.json, repo_name: github.com/sourcegraph/ghdump, revision: 0b04aa0167b403f5fc7ed43ecf15a34e76eb1978, snippet: "{"total_count":7232,"incomplete_results":false,"items":[{"id":98503256,"node_id""… 22:24:48 frontend | ERROR syntax highlighting failed (this is a bug, please report it), filepath: api_response_dump/lang-php__star-000069__ppg-100__pg-08.json, repo_name: github.com/sourcegraph/ghdump, revision: 0b04aa0167b403f5fc7ed43ecf15a34e76eb1978, snippet: "{"total_count":10663,"incomplete_results":false,"items":[{"id":1346804,"node_id""…, error: http://localhost:9238: HSS worker timeout while serving request 22:24:48 frontend | ERROR syntax highlighting failed (this is a bug, please report it), filepath: api_response_dump/lang-php__star-000021__ppg-100__pg-05.json, repo_name: github.com/sourcegraph/ghdump, revision: 0b04aa0167b403f5fc7ed43ecf15a34e76eb1978, snippet: "{"total_count":28383,"incomplete_results":false,"items":[{"id":264199730,"node_i"…, error: http://localhost:9238: HSS worker timeout while serving request 22:24:48 frontend | WARN syntax highlighting took longer than 3s, this could indicate a bug in Sourcegraph, filepath: api_response_dump/lang-javascript__star-000038__ppg-100__pg-08.json, repo_name: github.com/sourcegraph/ghdump, revision: 0b04aa0167b403f5fc7ed43ecf15a34e76eb1978, snippet: "{"total_count":75790,"incomplete_results":false,"items":[{"id":162595548,"node_i"… 22:24:48 frontend | WARN syntax highlighting took longer than 3s, this could indicate a bug in Sourcegraph, filepath: api_response_dump/lang-go__star-000037__ppg-100__pg-05.json, repo_name: github.com/sourcegraph/ghdump, revision: 0b04aa0167b403f5fc7ed43ecf15a34e76eb1978, snippet: "{"total_count":16439,"incomplete_results":false,"items":[{"id":5313975,"node_id""… 22:24:48 syntect_server | 2020/08/06 05:24:48 worker 120: EOF 22:24:48 frontend | ERROR syntax highlighting failed (this is a bug, please report it), filepath: api_response_dump/lang-python__star-000101__ppg-100__pg-10.json, repo_name: github.com/sourcegraph/ghdump, revision: 0b04aa0167b403f5fc7ed43ecf15a34e76eb1978, snippet: "{"total_count":26085,"incomplete_results":false,"items":[{"id":30264349,"node_id"…, error: http://localhost:9238: HSS worker timeout while serving request 22:24:48 frontend | WARN syntax highlighting took longer than 3s, this could indicate a bug in Sourcegraph, filepath: api_response_dump/lang-javascript__star-000211__ppg-100__pg-07.json, repo_name: github.com/sourcegraph/ghdump, 
revision: 0b04aa0167b403f5fc7ed43ecf15a34e76eb1978, snippet: "{"total_count":20122,"incomplete_results":false,"items":[{"id":54505054,"node_id"… 22:24:48 frontend | WARN syntax highlighting took longer than 3s, this could indicate a bug in Sourcegraph, filepath: api_response_dump/lang-java__star-000024__ppg-100__pg-04.json, repo_name: github.com/sourcegraph/ghdump, revision: 0b04aa0167b403f5fc7ed43ecf15a34e76eb1978, snippet: "{"total_count":47544,"incomplete_results":false,"items":[{"id":17690143,"node_id"… 22:24:48 syntect_server | 2020/08/06 05:24:48 worker 120: EOF 22:24:48 syntect_server | 2020/08/06 05:24:48 worker 120: 22:24:48 syntect_server | 2020/08/06 05:24:48 worker 120: signal: killed 22:24:48 syntect_server | 2020/08/06 05:24:48 worker 77: restarting due to timeout 22:24:48 syntect_server | 2020/08/06 05:24:48 worker 77: restarting due to timeout 22:24:48 syntect_server | 2020/08/06 05:24:48 worker 120: EOF 22:24:48 frontend | WARN syntax highlighting took longer than 3s, this could indicate a bug in Sourcegraph, filepath: api_response_dump/lang-go__star-000021__ppg-100__pg-05.json, repo_name: github.com/sourcegraph/ghdump, revision: 0b04aa0167b403f5fc7ed43ecf15a34e76eb1978, snippet: "{"total_count":23608,"incomplete_results":false,"items":[{"id":204168441,"node_i"… 22:24:48 frontend | WARN syntax highlighting took longer than 3s, this could indicate a bug in Sourcegraph, filepath: api_response_dump/lang-java__star-001269__ppg-100__pg-08.json, repo_name: github.com/sourcegraph/ghdump, revision: 0b04aa0167b403f5fc7ed43ecf15a34e76eb1978, snippet: "{"total_count":2004,"incomplete_results":false,"items":[{"id":95203155,"node_id""… 22:24:48 frontend | WARN syntax highlighting took longer than 3s, this could indicate a bug in Sourcegraph, filepath: api_response_dump/lang-javascript__star-000597__ppg-100__pg-06.json, repo_name: github.com/sourcegraph/ghdump, revision: 0b04aa0167b403f5fc7ed43ecf15a34e76eb1978, snippet: "{"total_count":8359,"incomplete_results":false,"items":[{"id":40485527,"node_id""… 22:24:48 frontend | WARN syntax highlighting took longer than 3s, this could indicate a bug in Sourcegraph, filepath: api_response_dump/lang-java__star-001269__ppg-100__pg-07.json, repo_name: github.com/sourcegraph/ghdump, revision: 0b04aa0167b403f5fc7ed43ecf15a34e76eb1978, snippet: "{"total_count":1652,"incomplete_results":true,"items":[{"id":283325,"node_id":"M"… 22:24:48 frontend | WARN syntax highlighting took longer than 3s, this could indicate a bug in Sourcegraph, filepath: api_response_dump/lang-java__star-000042__ppg-100__pg-09.json, repo_name: github.com/sourcegraph/ghdump, revision: 0b04aa0167b403f5fc7ed43ecf15a34e76eb1978, snippet: "{"total_count":31952,"incomplete_results":false,"items":[{"id":121508984,"node_i"… 22:24:48 frontend | WARN syntax highlighting took longer than 3s, this could indicate a bug in Sourcegraph, filepath: api_response_dump/lang-javascript__star-000047__ppg-100__pg-06.json, repo_name: github.com/sourcegraph/ghdump, revision: 0b04aa0167b403f5fc7ed43ecf15a34e76eb1978, snippet: "{"total_count":64664,"incomplete_results":false,"items":[{"id":137706555,"node_i"… 22:24:48 frontend | WARN syntax highlighting took longer than 3s, this could indicate a bug in Sourcegraph, filepath: api_response_dump/lang-javascript__star-000033__ppg-100__pg-03.json, repo_name: github.com/sourcegraph/ghdump, revision: 0b04aa0167b403f5fc7ed43ecf15a34e76eb1978, snippet: "{"total_count":84013,"incomplete_results":false,"items":[{"id":218937429,"node_i"… 22:24:48 frontend | WARN 
syntax highlighting took longer than 3s, this could indicate a bug in Sourcegraph, filepath: api_response_dump/lang-javascript__star-002315__ppg-100__pg-04.json, repo_name: github.com/sourcegraph/ghdump, revision: 0b04aa0167b403f5fc7ed43ecf15a34e76eb1978, snippet: "{"total_count":2374,"incomplete_results":false,"items":[{"id":92652320,"node_id""… 22:24:48 frontend | ERROR syntax highlighting failed (this is a bug, please report it), filepath: api_response_dump/lang-python__star-000061__ppg-100__pg-10.json, repo_name: github.com/sourcegraph/ghdump, revision: 0b04aa0167b403f5fc7ed43ecf15a34e76eb1978, snippet: "{"total_count":39202,"incomplete_results":false,"items":[{"id":26034643,"node_id"…, error: http://localhost:9238: HSS worker timeout while serving request 22:24:48 frontend | ERROR syntax highlighting failed (this is a bug, please report it), filepath: api_response_dump/lang-python__star-000110__ppg-100__pg-03.json, repo_name: github.com/sourcegraph/ghdump, revision: 0b04aa0167b403f5fc7ed43ecf15a34e76eb1978, snippet: "{"total_count":24225,"incomplete_results":false,"items":[{"id":102413247,"node_i"…, error: http://localhost:9238: HSS worker timeout while serving request 22:24:48 frontend | ERROR syntax highlighting failed (this is a bug, please report it), filepath: api_response_dump/lang-php__star-000120__ppg-100__pg-09.json, repo_name: github.com/sourcegraph/ghdump, revision: 0b04aa0167b403f5fc7ed43ecf15a34e76eb1978, snippet: "{"total_count":6719,"incomplete_results":false,"items":[{"id":20015974,"node_id""…, error: http://localhost:9238: HSS worker timeout while serving request 22:24:48 frontend | ERROR syntax highlighting failed (this is a bug, please report it), filepath: api_response_dump/lang-python__star-000055__ppg-100__pg-01.json, repo_name: github.com/sourcegraph/ghdump, revision: 0b04aa0167b403f5fc7ed43ecf15a34e76eb1978, snippet: "{"total_count":34631,"incomplete_results":true,"items":[{"id":139160363,"node_id"…, error: http://localhost:9238: HSS worker timeout while serving request 22:24:48 syntect_server | 2020/08/06 05:24:48 worker 77: restarting due to timeout 22:24:48 syntect_server | 2020/08/06 05:24:48 worker 135: started on port 46503 22:24:48 syntect_server | 2020/08/06 05:24:48 worker 135: Configured for production. 
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 92: restarting due to timeout
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 135: => address: 0.0.0.0
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 135: => port: 46503
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 135: => log: critical
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 135: => workers: 12
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 135: => secret key: provided
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 135: => limits: forms = 32KiB, json* = 10MiB
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 135: => keep-alive: disabled
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 135: => tls: disabled
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 135: Rocket has launched from http://0.0.0.0:46503
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 77: restarting due to timeout
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 77: restarting due to timeout
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 77: restarting due to timeout
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 77: restarting due to timeout
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 77: restarting due to timeout
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 92: restarting due to timeout
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 92: restarting due to timeout
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 77: restarting due to timeout
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 77: restarting due to timeout
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 106: EOF
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 106: EOF
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 106:
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 106: signal: killed
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 106: EOF
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 106: EOF
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 150: started on port 33341
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 150: Configured for production.
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 150: => address: 0.0.0.0
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 150: => port: 33341
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 150: => log: critical
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 150: => workers: 12
22:24:48 frontend | ERROR syntax highlighting failed (this is a bug, please report it), filepath: api_response_dump/lang-python__star-000054__ppg-100__pg-08.json, repo_name: github.com/sourcegraph/ghdump, revision: 0b04aa0167b403f5fc7ed43ecf15a34e76eb1978, snippet: "{"total_count":43088,"incomplete_results":false,"items":[{"id":151285266,"node_i"…, error: http://localhost:9238: HSS worker timeout while serving request
22:24:48 frontend | ERROR syntax highlighting failed (this is a bug, please report it), filepath: api_response_dump/lang-ruby__star-000021__ppg-100__pg-07.json, repo_name: github.com/sourcegraph/ghdump, revision: 0b04aa0167b403f5fc7ed43ecf15a34e76eb1978, snippet: "{"total_count":21461,"incomplete_results":false,"items":[{"id":26945890,"node_id"…, error: http://localhost:9238: HSS worker timeout while serving request
22:24:48 frontend | ERROR syntax highlighting failed (this is a bug, please report it), filepath: api_response_dump/lang-python__star-000105__ppg-100__pg-01.json, repo_name: github.com/sourcegraph/ghdump, revision: 0b04aa0167b403f5fc7ed43ecf15a34e76eb1978, snippet: "{"total_count":25166,"incomplete_results":false,"items":[{"id":4200468,"node_id""…, error: http://localhost:9238: HSS worker timeout while serving request
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 150: => secret key: provided
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 150: => limits: forms = 32KiB, json* = 10MiB
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 150: => keep-alive: disabled
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 150: => tls: disabled
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 150: Rocket has launched from http://0.0.0.0:33341
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 77:
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 77: signal: killed
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 92: EOF
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 92:
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 92: signal: killed
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 92: EOF
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 92: EOF
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 164: Configured for production.
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 164: => address: 0.0.0.0
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 164: => port: 43955
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 164: => log: critical
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 164: => workers: 12
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 164: => secret key: provided
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 164: => limits: forms = 32KiB, json* = 10MiB
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 164: => keep-alive: disabled
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 164: => tls: disabled
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 164: started on port 43955
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 164: Rocket has launched from http://0.0.0.0:43955
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 178: started on port 40397
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 178: Configured for production.
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 178: => address: 0.0.0.0
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 178: => port: 40397
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 178: => log: critical
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 178: => workers: 12
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 178: => secret key: provided
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 178: => limits: forms = 32KiB, json* = 10MiB
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 178: => keep-alive: disabled
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 178: => tls: disabled
22:24:48 syntect_server | 2020/08/06 05:24:48 worker 178: Rocket has launched from http://0.0.0.0:40397

Paging @slimsag :-) One of the example files in the log that breaks is https://github.com/sourcegraph/ghdump/blob/master/api_response_dump/lang-python__star-000105__ppg-100__pg-01.json.

nicksnyder

comment created time in 2 days

issue closedsourcegraph/sourcegraph

search: Support AND / OR / NOT predicate language in search queries

This is the umbrella issue for reaching feature parity with OpenGrok. This is an OKR of the Core Services team in Q1 2020.

Progress:

  • [x] Design for and/or queries is tracked in #8346
  • [x] Implement predicates support for searching file content #8567
    • [x] Parser additions #8633 #8802 #8784 #8809
    • [x] Refactors for existing search #8566 #8565 #8564
    • [x] Heuristics to make search usable and robust #9762 #9816
    • [x] Basic query validation #9817 #9172 #9754

Q2:

  • [x] Refine query validation #9971
  • [x] Add NOT as syntactic sugar #9976
  • [x] Explore search support for non-content searches (i.e., repo, filepath, commit, ... searches) #9974, #11009

  • [ ] TODO: Add checking of and/or query usage to pings #9177

Related documents: RFC 94. Related issues: #7154, #4774, #1005, #8397, #636

closed time in 2 days

tsenart

issue commentsourcegraph/sourcegraph

search: Support AND / OR / NOT predicate language in search queries

Closing this--the tasks for Q2 scope are done. We'll introduce and/or expressions for general queries tracked in #11009 and prototyped in https://github.com/sourcegraph/sourcegraph/tree/rvt/nested-search-7777

tsenart

comment created time in 2 days

issue closedsourcegraph/sourcegraph

Implement and/or-expressions for file and repo search (AKA hierarchical search)

This task tracks initial exploration for expanding and/or queries to fully nested search. Wanted to start with this in 3.16, but don't think I'll have time.

Relates to #9712 and issues in #7823

closed time in 2 days

rvantonder

issue closedsourcegraph/sourcegraph

Remove restrictions on and/or query processing

We are currently conservative about triggering processing for an and/or query, because the new parser pipeline has not been stress tested yet. Basically, we fall back to the older parser if the query does not contain and or or. The way we check this is very heuristic-y. On the one hand, it's a cheap check, but on the other, it's prone to corner cases that can get around it. Because the parser does simplifications, syntax like and is not exposed in the final query, so we can't inspect the parse tree to decide whether the query contained and/or expressions.

We should remove or improve this check.
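
For context, a minimal Go sketch of the kind of cheap string-level check described above (the function name is hypothetical and this is not the actual implementation):

package main

import (
	"fmt"
	"strings"
)

// containsAndOrKeyword is a hypothetical sketch of the cheap pre-check: scan
// the whitespace-separated tokens of the raw query for a standalone "and" or
// "or" keyword. Because it never parses the query, quoted patterns such as
// "black and white" also trigger it, which is the kind of corner case noted above.
func containsAndOrKeyword(input string) bool {
	for _, token := range strings.Fields(strings.ToLower(input)) {
		if token == "and" || token == "or" {
			return true
		}
	}
	return false
}

func main() {
	fmt.Println(containsAndOrKeyword("repo:foo a and b"))  // true
	fmt.Println(containsAndOrKeyword(`"black and white"`)) // true (false positive)
	fmt.Println(containsAndOrKeyword("repo:foo lang:go"))  // false
}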

closed time in 2 days

rvantonder

issue commentsourcegraph/sourcegraph

Remove restrictions on and/or query processing

Parser migration is imminent, so this is no longer relevant.

rvantonder

comment created time in 2 days

issue closedsourcegraph/sourcegraph

search: additional parentheses heuristics

The heuristic in #9760 works well but only triggers when there are no explicit parentheses. In some cases explicit parentheses do exist, so the heuristic doesn't trigger, as in:

("main(" and "name(") or testing count:1000 or count:1000 ("main(" and "name(") or testing or count:1000 "main(" and ("name(" or testing)

For these, what we could do for the heuristic instead is just parenthesize the pattern we think is reasonable (i.e., a contiguous string without a field:value parameter). Thus:

("main(" and "name(") or testing count:1000 => (("main(" and "name(") or testing) count:1000

and

count:1000 "main(" and ("name(" or testing) => count:1000 ("main(" and ("name(" or testing))

closed time in 2 days

rvantonder

issue commentsourcegraph/sourcegraph

search: additional parentheses heuristics

All reasonable heuristics are implemented in previous iterations, closing.

rvantonder

comment created time in 2 days

issue closedsourcegraph/sourcegraph

Better grouped-expression heuristics

Usability of and/or queries is hampered by tight binding to filters. The query:

repo:^github\.com/sourcegraph/sourcegraph$ (a and b) or (c and d)

needs parentheses:

repo:^github\.com/sourcegraph/sourcegraph$ ((a and b) or (c and d))

as the hoist heuristic is not sufficient for more nested expressions. For future nested searches, the interpretation

(repo:^github\.com/sourcegraph/sourcegraph$ a and b) or (c and d)

is indeed what we want, but this is arguably less common than the former example. The right move might just be to remove tight binding to fields.

closed time in 2 days

rvantonder

issue commentsourcegraph/sourcegraph

Better grouped-expression heuristics

All reasonable heuristics are implemented in previous iterations, closing.

rvantonder

comment created time in 2 days

issue closedsourcegraph/sourcegraph

Add dedicated HTTP client for the replacer service

Needed for metrics #4240 and relates to TODO comment

closed time in 2 days

rvantonder

issue commentsourcegraph/sourcegraph

Add dedicated HTTP client for the replacer service

Closing--replacer is going away.

rvantonder

comment created time in 2 days

issue closedsourcegraph/sourcegraph

Promote visibility of structural search at customers

This is something I've been putting off for a long time and it really deserves attention. There are a bunch of customers that could benefit from this feature already, but it needs some initial momentum. This mostly involves coming up with example searches for conventions or buggy patterns.

closed time in 2 days

rvantonder

issue commentsourcegraph/sourcegraph

Promote visibility of structural search at customers

Closing in favor of #12767

rvantonder

comment created time in 2 days

push eventsourcegraph/sourcegraph

Rijnard van Tonder

commit sha 23e5f56bee7dbe274d3a763d9e18331967a27cbe

search: perform same trailing paren heuristic as old parser in new parser

view details

push time in 2 days

Pull request review commentsourcegraph/sourcegraph

wip

 func substituteConcat(nodes []Node, separator string) []Node {
 	return new
 }
+// TrailingParensToLiteral is a heuristic used in the context of regular
+// expression search. It checks whether any pattern is annotated with the label
+// HeuristicDanglingParens. This label implies that the regular expression is not
+// well-formed, for example, "foo.*bar(" or "foo(.*bar". As a special case for
+// usability we escape a trailing parenthesis and treat it literally. Any other
+// forms are ignored, and will likely not pass validation.
+func TrailingParensToLiteral(nodes []Node) []Node {

cf. autofix for the older parser, which this function partly replaces. I'm not going to add that to a comment, because the old logic will be removed soon enough.

rvantonder

comment created time in 2 days

PR opened sourcegraph/sourcegraph

wip

Fixes #12733 (see description there). Apparently I left a TODO for myself about this.

We can inline the heuristic logic in the parser, and that would be more efficient. But as a matter of organizing this code, I prefer to apply any heuristics as passes after parsing where possible, even though that means doing another pass on the query. So I've implemented this as an additional transformer function.
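
To make the organization concrete, here is a minimal Go sketch of the pass-after-parsing idea. The Node type and pass names are stand-ins for illustration, not the real query package:

package main

import "fmt"

// Node is a stand-in for the parser's AST node type.
type Node struct {
	Value string
}

type pass func([]Node) []Node

// applyPasses runs each heuristic as a separate pass over the parse tree,
// rather than inlining the logic in the parser itself.
func applyPasses(nodes []Node, passes ...pass) []Node {
	for _, p := range passes {
		nodes = p(nodes)
	}
	return nodes
}

// trailingParensToLiteral is a hypothetical stand-in for the heuristic this PR
// adds: escape a dangling trailing "(" so the pattern is treated literally.
func trailingParensToLiteral(nodes []Node) []Node {
	for i, n := range nodes {
		if len(n.Value) > 0 && n.Value[len(n.Value)-1] == '(' {
			nodes[i].Value = n.Value[:len(n.Value)-1] + `\(`
		}
	}
	return nodes
}

func main() {
	nodes := []Node{{Value: "foo.*bar("}}
	fmt.Println(applyPasses(nodes, trailingParensToLiteral))
	// [{foo.*bar\(}]
}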

+29 -7

0 comment

3 changed files

pr created time in 2 days

push eventsourcegraph/sourcegraph

Rijnard van Tonder

commit sha ea8942cae40db90459a370541de6048aa3e38915

wip

view details

push time in 2 days

create branchsourcegraph/sourcegraph

branch : rvt/fix-regex-paren-heuristic

created branch time in 2 days

issue openedsourcegraph/sourcegraph

add structural search showcase page

I'll add a search page dedicated to some structural searches that point out the contexts where it is useful.

created time in 2 days

issue openedcomby-tools/comby

patterns inside multiline raw string delimiters sometimes don't work

Running cat file.tsx | comby ' query :[x]' 'test' -stdin -matcher .tsx on the file below should match.

import { DEFAULT_SOURCEGRAPH_URL, getAssetsURL } from '../../util/context'
import { initializeExtensions } from './extensions'

describe('Extensions controller', () => {
    it('Blocks GraphQL requests from extensions if they risk leaking private information to the public sourcegraph.com instance', () => {
        window.SOURCEGRAPH_URL = DEFAULT_SOURCEGRAPH_URL
        const { extensionsController } = initializeExtensions(
            {
                urlToFile: () => '',
                getContext: () => ({ rawRepoName: 'foo', privateRepository: true }),
            },
            {
                sourcegraphURL: DEFAULT_SOURCEGRAPH_URL,
                assetsURL: getAssetsURL(DEFAULT_SOURCEGRAPH_URL),
            },
            false
        )
        return expect(
            extensionsController.executeCommand({
                command: 'queryGraphQL',
                arguments: [
                    `
                        query ResolveRepo($repoName: String!) {
                            repository(name: $repoName) {
                                url
                            }
                        }
                    `,
                    { repoName: 'foo' },
                ],
            })
        ).rejects.toMatchObject({
            message:
                'A ResolveRepo GraphQL request to the public Sourcegraph.com was blocked because the current repository is private.',
        })
    })
})

created time in 2 days

issue openedcomby-tools/comby

Add # comments for graphql

created time in 2 days

issue commentsourcegraph/sourcegraph

Search 504s

But I would suspect the issue isn't the speed of "grep", but that we are struggling to fetch an archive of ghdump. Our searcher code doesn't do per-line searching, but uses the same technique as rg of searching the whole body and hydrating in line breaks.

Whoops, guess my diagnosis is off then. I'll remember that part of our searcher now :-) The downloaded zip (at master) is only about 200MB, which is about the size of the Linux repository, and we don't have any issues with that IIRC (a git clone of ghdump is 2.6GB). So it seems like there's still something else going on.
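
For anyone following along, a minimal Go sketch of the whole-body-then-hydrate-line-numbers technique mentioned above (an illustration, not the actual searcher code):

package main

import (
	"fmt"
	"regexp"
	"strings"
)

// match records a hit found by scanning the whole file body once, with the
// line number computed afterwards ("hydrated") from the byte offset.
type match struct {
	Line int
	Text string
}

// searchWholeBody runs the pattern over the entire content rather than line
// by line, then maps each match offset back to a line number by counting
// preceding newlines.
func searchWholeBody(content string, re *regexp.Regexp) []match {
	var results []match
	for _, loc := range re.FindAllStringIndex(content, -1) {
		line := strings.Count(content[:loc[0]], "\n") + 1
		results = append(results, match{Line: line, Text: content[loc[0]:loc[1]]})
	}
	return results
}

func main() {
	content := "package main\n\nfunc main() {\n\tprintln(\"hi\")\n}\n"
	for _, m := range searchWholeBody(content, regexp.MustCompile(`println`)) {
		fmt.Printf("line %d: %s\n", m.Line, m.Text)
	}
	// line 4: println
}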

nicksnyder

comment created time in 2 days

issue commentsourcegraph/sourcegraph

Enable globbing for Sourcegraph org members

I think the sort of magic which responds to the results makes things quite complicated to understand. Displaying different results differently makes a lot of sense to me and can be done purely in the code that collects results so that sounds much more reliable.

Agree. What I meant to communicate is the order of best-effort assumptions we should make about the intent of a user's query (rather than explain it in terms of introspecting results). The assumption is what you mentioned: absence of glob syntax (like file:G) implies looking for results that match a substring (implicitly file:**G**), but exact matches should take precedence, since they have a higher likelihood of being the original intent.

Like you mention, we would definitely need a reference to root files in the form of / or ./, because if we always assume the intent of file:G is a pattern, then there is no way to disambiguate file:G from an exact match of file:/G versus file:a/b/G. For the sake of a simplifying assumption in the implementation, i.e., without introducing the / part, I feel like we can run a search for a query repo:foo file:bar as repo:**foo** file:**bar** and then decide to show either only exact matches, or exact matches first and fuzzy matches. Otherwise, we should spec out briefly the implementation of the extra work of adding /... and commit to that.

@poojaj-tech:

There's a few expressive regex use cases that globbing doesn't cover, will separate them out. Perhaps the new and/or syntax will help with expressing these.

This would be useful! Yeah, I fully expect that globbing should work in tandem with or operators (which is, I think, the only real thing that regex gives that globbing doesn't). I aim to activate or operators on file and repo at around the same time, but it may lag by a week or so.
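
To make the simplifying assumption above concrete, a rough Go sketch (hypothetical function, not the actual globbing implementation) of rewriting a filter value with no glob syntax into an implicit substring pattern:

package main

import (
	"fmt"
	"strings"
)

// wrapImplicitGlob interprets a filter value with no glob syntax as a
// substring pattern by wrapping it in "**". Values that already use glob
// syntax are left alone. The treatment of "/" is deliberately ignored here,
// as discussed above.
func wrapImplicitGlob(value string) string {
	if strings.ContainsAny(value, "*?[") {
		return value // user already wrote a glob
	}
	return "**" + value + "**"
}

func main() {
	fmt.Println(wrapImplicitGlob("bar"))    // **bar**
	fmt.Println(wrapImplicitGlob("ba*.go")) // ba*.go (unchanged)
}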

rvantonder

comment created time in 3 days

pull request commentsourcegraph/sourcegraph

search: filter out noisy inputs used in diff testing

Because it sounded like your previous suggestion was to exclude these before logging? Saves piping the noise over the wire. I plan to delete this tomorrow, if you're worried about overhead?

rvantonder

comment created time in 3 days

issue commentsourcegraph/sourcegraph

Enable globbing for Sourcegraph org members

@poojaj-tech I think that's a great idea! We could probably do this fairly easily: currently we sort all result matches (by repo, then by file). To put that at the top, we would just look in that result set for an exact path match and prepend it to the list of results. That operation is probably cheap enough that it won't have a big perf hit.

I might not even mention the syntax difference for each. If this is such a common search, we would want to offer them a more convenient syntax (like you mention above) that gets them all the results by typing just foo.go.

Yes, agree. If we point out 'exact match' versus 'other matches' we don't even have to mention that we ran the equivalent of **foo.go**.
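
A rough Go sketch of the "prepend the exact match" operation described above; the function name and the plain string slice are hypothetical stand-ins for the real result types:

package main

import "fmt"

// promoteExactMatch looks through an already-sorted list of file paths for one
// that exactly equals the query and moves it to the front, preserving the
// order of everything else.
func promoteExactMatch(paths []string, query string) []string {
	for i, p := range paths {
		if p == query {
			rest := append(append([]string(nil), paths[:i]...), paths[i+1:]...)
			return append([]string{p}, rest...)
		}
	}
	return paths
}

func main() {
	results := []string{"cmd/foo.go", "foo.go", "internal/foo.go"}
	fmt.Println(promoteExactMatch(results, "foo.go"))
	// [foo.go cmd/foo.go internal/foo.go]
}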

rvantonder

comment created time in 3 days

push eventsourcegraph/sourcegraph

Rijnard van Tonder

commit sha 77524f7c1f796dd45d7a0c275d27b4d8e48efce5

search: filter out noisy inputs used in diff testing (#12734)

view details

push time in 3 days

delete branch sourcegraph/sourcegraph

delete branch : rvt/log-more-queries

delete time in 3 days

PR merged sourcegraph/sourcegraph

search: filter out noisy inputs used in diff testing

There are a lot of noisy searches logged (saved searches and symbol stuff), so now I'm acting on "maybe exclude type:symbol and log everything?". The previous inputs have been useful, but I would now like more diversity that I can collect over a short time frame (so no throttling to 1 in 10 queries; there hasn't been that much coming in over a short time frame).

+4 -3

0 comment

1 changed file

rvantonder

pr closed time in 3 days

PR opened sourcegraph/sourcegraph

search: filter out noisy inputs used in diff testing

There are a lot of noisy searches logged (saved searches and symbol stuff), so now I'm acting on "maybe exclude type:symbol and log everything?". These have been useful, but I would now like more diversity that I can collect over a short time frame (so no throttling to 1 in 10 queries; there hasn't been that much coming in over a short time frame).

+4 -3

0 comment

1 changed file

pr created time in 3 days

create barnchsourcegraph/sourcegraph

branch : rvt/log-more-queries

created branch time in 3 days

push eventsourcegraph/sourcegraph

Rijnard van Tonder

commit sha 073908d9e487d4f8bf6dc7179f253af0d178f075

some convenience changes

view details

push time in 3 days

issue commentsourcegraph/sourcegraph

Feedback for Custom Search Pages and new Homepage Design

These seem done, let's close the issue @poojaj-tech?

poojaj-tech

comment created time in 3 days

issue commentsourcegraph/sourcegraph

search: multiple repohasfile inconsistency between repo search and text search

Going through some search-labeled issues. Since this is a validation issue which we can deal with in the new parser (and also support the or expression soon enough), I'm reassigning this to myself, unless you really want to deal with it :-)

keegancsmith

comment created time in 3 days

issue commentsourcegraph/sourcegraph

Duplicate results in or-operator merge operation

Fixed in #12531

rvantonder

comment created time in 3 days

issue closedsourcegraph/sourcegraph

add syntax highlighting for NOT

relates to #9976

We have implemented -content in #12412 and NOT <negatable field>:value in #9976. We should highlight NOT as a keyword in the search bar.

closed time in 3 days

stefanhengl

issue commentsourcegraph/sourcegraph

add syntax highlighting for NOT

Sweet, closed by #12694

stefanhengl

comment created time in 3 days

issue openedsourcegraph/sourcegraph

Treat trailing parentheses in new parser similar to old parser

We have this autofix logic in the old parser that will escape trailing parens in a regex string. This isn't done in the new parser--the new parser instead requires the trailing ( to be escaped. When it is not escaped, it treats the input literally (bug).

Old:

Screen Shot 2020-08-04 at 9 35 12 PM

New (bug):

Screen Shot 2020-08-04 at 9 35 48 PM

Screen Shot 2020-08-04 at 9 35 32 PM
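
For reference, a minimal Go sketch of the autofix behavior described in this issue, assuming the trigger is "the pattern fails to compile as a regexp and ends with a dangling (". This is an illustration, not the old parser's actual code:

package main

import (
	"fmt"
	"regexp"
	"strings"
)

// escapeTrailingParen escapes a dangling trailing "(" when the pattern does
// not compile as a regular expression, then retries compilation.
func escapeTrailingParen(pattern string) string {
	if _, err := regexp.Compile(pattern); err == nil {
		return pattern // already a valid regexp
	}
	if strings.HasSuffix(pattern, "(") {
		fixed := pattern[:len(pattern)-1] + `\(`
		if _, err := regexp.Compile(fixed); err == nil {
			return fixed
		}
	}
	return pattern
}

func main() {
	fmt.Println(escapeTrailingParen("foo.*bar(")) // foo.*bar\(
	fmt.Println(escapeTrailingParen("foo.*bar"))  // foo.*bar (unchanged)
}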

created time in 3 days

push eventsourcegraph/sourcegraph

Rijnard van Tonder

commit sha 9ed0ba67e01fd998942d87572a4254633a8f9c63

search: differential parser test utility

view details

push time in 3 days

PR opened sourcegraph/sourcegraph

search: differential parser test utility

Putting this up for visibility for @keegancsmith and @stefanhengl. I don't think it's high enough value to merge, but it lets me document a couple of inputs to handle and might be of interest.

Run go build in the directory of this file. Then you simply feed the string representation of a query as the first arg. The program will report a difference in valid/invalid queries (some differences here are to be expected, since the new parser fixes a bunch of issues as per https://github.com/sourcegraph/sourcegraph/issues/8780). For example,

./parser-testing ':'

New parser has weaker validation: old parser reports parse error at character 0: got TokenColon, want expr

These are "soft" cases that are good to know about, but not fundamentally problematic.

The structural equivalence comes down to just comparing a String representation of the field/values map produced by the old parser. In the utility, I have made this condition panic if there's a difference, because this way we can feed inputs from a fuzzer as well. I will first be doing manual queries based on input I collected, and then some fuzz inputs with a corpus based on the collection and test suite inputs.

Some current, known examples for triggering a difference are:

./parser-testing '/derp/'
panic: -old, +new:   string(
- 	`~"derp"`,
+ 	`~"/derp/"`,
  )
  • Since we don't support the /.../ regex syntax in the new parser yet.
./parser-testing 'asdf.*asdf('
panic: -old, +new:   string(
- 	`~"asdf.*asdf\\("`,
+ 	`:"asdf.*asdf("`,
  )
  • Since we don't have a heuristic that escapes a trailing ( in the new parser (search for autofix to see what we do in the old parser).
./parser-testing 'content:"yo"'
panic: -old, +new:   string(
- 	`content:"yo"`,
+ 	`:"yo"`,
  )
  • The old parser processes content in a different location in search_results, compared to the new parser that handles it at parse/validation time.
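
For the curious, the comparison loop amounts to something like the following Go sketch. parseOld and parseNew here are stubs standing in for the two parser pipelines, not the real entry points, and the real utility compares the field/value map representations rather than a quoted string:

package main

import (
	"fmt"
	"os"
)

// parseOld and parseNew are hypothetical stand-ins for the old and new query
// parsers; each returns a string representation of the parsed query.
func parseOld(input string) (string, error) { return fmt.Sprintf("%q", input), nil }
func parseNew(input string) (string, error) { return fmt.Sprintf("%q", input), nil }

// compare panics on any structural difference so the same function can double
// as a fuzz target: a fuzzer feeding query strings surfaces discrepancies as
// crashes, while validation differences are only reported.
func compare(input string) {
	oldRepr, oldErr := parseOld(input)
	newRepr, newErr := parseNew(input)
	if (oldErr == nil) != (newErr == nil) {
		fmt.Printf("validation differs: old=%v new=%v\n", oldErr, newErr)
		return
	}
	if oldErr == nil && oldRepr != newRepr {
		panic(fmt.Sprintf("-old, +new:\n- %s\n+ %s", oldRepr, newRepr))
	}
}

func main() {
	if len(os.Args) < 2 {
		fmt.Println("usage: parser-testing '<query>'")
		return
	}
	compare(os.Args[1])
}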
+38 -0

0 comment

1 changed file

pr created time in 3 days

create branchsourcegraph/sourcegraph

branch : rvt/diff-test-parser

created branch time in 3 days

pull request commentsourcegraph/sourcegraph

Fix patternType addition in SearchResults

Ah, OK. Thanks for clarifying.

lguychard

comment created time in 3 days

push eventsourcegraph/sourcegraph

Rijnard van Tonder

commit sha 8b3e06bfb4af0b5c0881b7a490af3cd2a4c75d50

web: highlight search operators and/or/not (#12694)

view details

push time in 3 days

delete branch sourcegraph/sourcegraph

delete branch : rvt/syntax-highlight-operators

delete time in 3 days

more