Erik Seliger eseliger @sourcegraph Bremen, Germany

codecov/sourcegraph-codecov 50

See code coverage information from Codecov on GitHub, Sourcegraph, and other tools.

eseliger/angular-auto-grow-input 0

This directive allows your inputs to grow as the user types. The input's width always fits the text the user has typed into the input.

eseliger/bootstrap 0

Native AngularJS (Angular) directives for Bootstrap. Smaller footprint (20kB gzipped), no 3rd party JS dependencies (jQuery, bootstrap JS) required. Please read the README.md file before submitting an issue!

eseliger/bundlesize 0

Keep your bundle size in check

eseliger/circleci-2.0-beta-docker-example 0

An example project for building docker images on CircleCI 2.0 Beta

eseliger/core 0

UI-Router Core: Framework agnostic, State-based routing for JavaScript Single Page Apps

eseliger/DefinitelyTyped 0

The repository for high quality TypeScript type definitions.

eseliger/docker-apgdiff 0

Docker image for apgdiff 🐳

eseliger/docker-oracle-12c 0

:whale: Docker image with Oracle Database 12c on board

eseliger/favicons 0

Favicons generator for Node.js

PR opened sourcegraph/sourcegraph

Add storybooks for file diffs and improve ugly states (labels: campaigns, debt)

For work on the campaigns UI, I needed these as storybooks as well. While working on it, I realized two things:

  • We are inconsistent with the badging for moved files, so I added a badge there as well.
  • We always show the diff for deleted files by default, but most code hosts hide those by default (for good reasons, I think), so I added that functionality to ours as well (see the sketch below).
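
A minimal sketch of the second point, using hypothetical names rather than the actual GraphQL types: a file diff with no new path is a deletion, and deletions start collapsed by default, mirroring most code hosts.

// Hypothetical shape, for illustration only.
interface FileDiffNode {
    oldPath: string | null
    newPath: string | null
}

// A file diff without a new path describes a deleted file.
function isDeletedFile(fileDiff: FileDiffNode): boolean {
    return fileDiff.newPath === null
}

// Deleted files start collapsed; everything else starts expanded.
export function defaultExpanded(fileDiff: FileDiffNode): boolean {
    return !isDeletedFile(fileDiff)
}
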
+309 -1

0 comments

3 changed files

pr created time in 7 hours

create branch sourcegraph/sourcegraph

branch : es/file-diff-story

created branch time in 7 hours

Pull request review comment sourcegraph/sourcegraph

codeintel: Reduce correlation memory usage by 50%

 func correlateFromReader(r io.Reader, root string) (*State, error) {
 type wrappedState struct {
 	*State
 	dumpRoot            string
-	unsupportedVertexes datastructures.IDSet
+	unsupportedVertexes *datastructures.IDSet

Haha, I think this could be another fun anecdote to write a blog post about at some point.

efritz

comment created time in 9 hours

PR merged sourcegraph/sourcegraph

Remove globalThis polyfill (label: debt)

Seems to be a left-over

+0 -4

0 comments

2 changed files

eseliger

pr closed time in 15 hours

push event sourcegraph/sourcegraph

Erik Seliger

commit sha 69cb08fe710dd52d4743a481f833c8189b5af9b3

Remove globalThis polyfill (#12024)


push time in 15 hours

delete branch sourcegraph/sourcegraph

delete branch : es/remove-globalthis-poly

delete time in 15 hours

Pull request review comment sourcegraph/sourcegraph

Remove globalThis polyfill

-// Remove this file after upgrading to Node 12 (blocked on node-sass support)

I could imagine this working with team handles as well, like @sourcegraph/web

eseliger

comment created time in 16 hours

Pull request review comment sourcegraph/sourcegraph

Remove globalThis polyfill

-// Remove this file after upgrading to Node 12 (blocked on node-sass support)

It would be great to have a way to get reminded of these, maybe through a specific comment format(?)

// debt(eseliger, 3.18): This is deprecated, remove this file once 3.18 is out.

// debt(eseliger, 01-08-2020): Check if node 12 is out, if so remove this file.
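
A minimal sketch of how such reminders could be checked in CI, assuming the // debt(owner, version): format above; this is not an existing Sourcegraph tool, and the date variant would need an analogous check.

import { readFileSync } from 'fs'

// The release currently being built; assumed to be provided by the release tooling.
const CURRENT_RELEASE = process.env.RELEASE_VERSION || '3.18'

// Matches comments like: // debt(eseliger, 3.18): remove this file once 3.18 is out.
const DEBT_PATTERN = /\/\/ debt\(([\w-]+), (\d+)\.(\d+)\): (.+)/g

function isReleased(major: number, minor: number, current: string): boolean {
    const [currentMajor, currentMinor] = current.split('.').map(Number)
    return currentMajor > major || (currentMajor === major && currentMinor >= minor)
}

// Returns the debt comments in a file whose target release is already out.
export function findExpiredDebt(filePath: string): string[] {
    const source = readFileSync(filePath, 'utf8')
    const expired: string[] = []
    for (const [, owner, major, minor, message] of source.matchAll(DEBT_PATTERN)) {
        if (isReleased(Number(major), Number(minor), CURRENT_RELEASE)) {
            expired.push(`${filePath}: @${owner}: ${message}`)
        }
    }
    return expired
}
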
eseliger

comment created time in 16 hours

PR opened sourcegraph/sourcegraph

Remove globalThis polyfill

Seems to be a left-over

+0 -4

0 comments

2 changed files

pr created time in 16 hours

create branch sourcegraph/sourcegraph

branch : es/remove-globalthis-poly

created branch time in 17 hours

push event sourcegraph/sourcegraph

renovate[bot]

commit sha 88a6c593279d22614eabde9d8ae330f954cb5c17

Update dependency sass-loader to v9 (#11994) Co-authored-by: Renovate Bot <bot@renovateapp.com>


push time in 17 hours

delete branch sourcegraph/sourcegraph

delete branch : renovate/sass-loader-9.x

delete time in 17 hours

PR merged sourcegraph/sourcegraph

Update dependency sass-loader to v9 (labels: bot, npm)

This PR contains the following updates:

Package Type Update New value References Sourcegraph
sass-loader devDependencies major ^9.0.2 source code search for "sass-loader"

Release Notes

webpack-contrib/sass-loader

v9.0.2

v9.0.1

v9.0.0

⚠ BREAKING CHANGES
  • minimum supported Nodejs version is 10.13
  • prefer sass (dart-sass) by default, it is strongly recommended to migrate on sass (dart-sass)
  • the prependData option was removed in favor the additionalData option, see docs
  • when the sourceMap is true, sassOptions.sourceMap, sassOptions.sourceMapContents, sassOptions.sourceMapEmbed, sassOptions.sourceMapRoot and sassOptions.omitSourceMapUrl will be ignored.
Features
  • pass the loader context to custom importers under the this.webpackLoaderContext property (#​853) (d487683)
  • supports for process.cwd() resolution logic by default (#​837) (0c8d3b3)
  • supports for SASS-PATH env variable resolution logic by default (#​836) (8376179)
  • supports for the sass property for the exports field from package.json (conditional exports, for more information read docs)
Bug Fixes
  • avoid different content on different os (#​832) (68dd278)
  • resolution logic when the includePaths option used was improved (#​827) (cbe5ad4)
  • resolution logic for file:// scheme was improved (17832fd)
  • resolution logic for absolute paths and server relative URLs was improved
  • source maps generation was improved
8.0.2 (2020-01-13)

8.0.1 (2020-01-10)


Renovate configuration

:date: Schedule: "on the 1st through 7th day of the month" in timezone America/Los_Angeles.

:vertical_traffic_light: Automerge: Disabled by config. Please merge this manually once you are satisfied.

:recycle: Rebasing: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.

:no_bell: Ignore: Close this PR and you won't be reminded about this update again.


  • [ ] If you want to rebase/retry this PR, check this box

This PR has been generated by WhiteSource Renovate. View repository job log here.

+15 -26

9 comments

2 changed files

renovate[bot]

pr closed time in 17 hours

pull request comment sourcegraph/sourcegraph

Update dependency sass-loader to v9

Hm very annoying. I guess we can let renovate rebase this PR then..

renovate[bot]

comment created time in 17 hours

pull request comment sourcegraph/sourcegraph

Update dependency sass-loader to v9

Webpack build is 11s slower on this branch than latest commit on master (1min 36s vs 1min 43s)

I guess that's hard to measure, as we don't have any caches filled on this branch yet 🤔 But isn't the difference from 1:36 to 1:43 7s? 😁 Still worth investigating though.

It caused a Percy diff 🤔

Hm. Seems like everything is shifted down a bit..

renovate[bot]

comment created time in 17 hours

pull request comment sourcegraph/sourcegraph

Update dependency sass-loader to v9

According to the readme, dart-sass is faster than libsass. However, the JavaScript version of dart-sass is a little slower than libsass, but I didn't find that to be a problem in my local dev environment; it felt quite fast 🤔

renovate[bot]

comment created time in 17 hours

pull request comment sourcegraph/sourcegraph

Update dependency sass-loader to v9

I updated our codebase to use sass instead of node-sass, as per this package's advice. So no more compile errors or recompiles needed on Node upgrades 🎊
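
For context, the relevant webpack rule after such a swap looks roughly like this (a sketch, not the actual Sourcegraph config; the import in additionalData is made up). sass-loader v9 already prefers dart-sass, but passing implementation makes the choice explicit, and prependData is now called additionalData:

import * as sass from 'sass' // dart-sass, replacing node-sass

export const scssRule = {
    test: /\.scss$/,
    use: [
        'style-loader',
        'css-loader',
        {
            loader: 'sass-loader',
            options: {
                // Use dart-sass explicitly instead of relying on default resolution.
                implementation: sass,
                // sass-loader v9 renamed prependData to additionalData.
                additionalData: '@import "variables";',
            },
        },
    ],
}
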

renovate[bot]

comment created time in 17 hours

push event sourcegraph/sourcegraph

Erik Seliger

commit sha 8c95e4049e7046180511c23402b5e5e1aaebf698

Deduplicate packages


push time in 17 hours

push event sourcegraph/sourcegraph

Erik Seliger

commit sha e31881c55f60b3deb59e4433e92bf4d3524a1426

Fix compilation


push time in 17 hours

push event sourcegraph/sourcegraph

Erik Seliger

commit sha 5d124a87c828d7f16075d0e965ef853f44516dbe

Swap node-sass in favor of sass


push time in 17 hours

push event sourcegraph/sourcegraph

renovate[bot]

commit sha 8f18d3308517fbf15bfe24add28be67169339217

Update dependency graphiql to v1 (#11984) Co-authored-by: Renovate Bot <bot@renovateapp.com>


push time in 18 hours

delete branch sourcegraph/sourcegraph

delete branch : renovate/graphiql-1.x

delete time in 18 hours

PR merged sourcegraph/sourcegraph

Update dependency graphiql to v1 (labels: bot, npm)

This PR contains the following updates:

Package Type Update New value References Sourcegraph
graphiql dependencies major ^1.0.3 source code search for "graphiql"

Release Notes

graphql/graphiql

v1.0.3

v1.0.2

v1.0.1

v1.0.0


Renovate configuration

:date: Schedule: "on the 1st through 7th day of the month" in timezone America/Los_Angeles.

:vertical_traffic_light: Automerge: Disabled by config. Please merge this manually once you are satisfied.

:recycle: Rebasing: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.

:no_bell: Ignore: Close this PR and you won't be reminded about this update again.


  • [ ] If you want to rebase/retry this PR, check this box

This PR has been generated by WhiteSource Renovate. View repository job log here.

+261 -90

2 comments

2 changed files

renovate[bot]

pr closed time in 18 hours

pull request comment sourcegraph/sourcegraph

Update dependency graphiql to v1

Tried manually, everything seems to work.

renovate[bot]

comment created time in 18 hours

Pull request review comment sourcegraph/sourcegraph

codeintel: Reduce correlation memory usage by 50%

 func correlateFromReader(r io.Reader, root string) (*State, error) {
 type wrappedState struct {
 	*State
 	dumpRoot            string
-	unsupportedVertexes datastructures.IDSet
+	unsupportedVertexes *datastructures.IDSet

I think it's vertices

efritz

comment created time in 18 hours

Pull request review comment sourcegraph/sourcegraph

update GraphQL API to remove unneeded namespace param, add applyCampaign test

 func (r *Resolver) ChangesetSpecByID(ctx context.Context, id graphql.ID) (graphq  func (r *Resolver) CreateCampaign(ctx context.Context, args *graphqlbackend.CreateCampaignArgs) (graphqlbackend.CampaignResolver, error) { 	var err error-	tr, ctx := trace.New(ctx, "Resolver.CreateCampaign", fmt.Sprintf("Namespace %s, CampaignSpec %s", args.Namespace, args.CampaignSpec))+	tr, ctx := trace.New(ctx, "Resolver.CreateCampaign", fmt.Sprintf("CampaignSpec %s", args.CampaignSpec)) 	defer func() { 		tr.SetError(err) 		tr.Finish() 	}()-	user, err := db.Users.GetByCurrentAuthUser(ctx)-	if err != nil {-		return nil, errors.Wrapf(err, "%v", backend.ErrNotAuthenticated)-	}--	// 🚨 SECURITY: Only site admins may create a campaign for now.-	if !user.SiteAdmin {-		return nil, backend.ErrMustBeSiteAdmin-	}--	campaign := &campaigns.Campaign{-		Name:     "TODO: not blank",-		AuthorID: user.ID,-	}--	switch relay.UnmarshalKind(args.Namespace) {-	case "User":-		err = relay.UnmarshalSpec(args.Namespace, &campaign.NamespaceUserID)-	case "Org":-		err = relay.UnmarshalSpec(args.Namespace, &campaign.NamespaceOrgID)-	default:-		err = errors.Errorf("Invalid namespace %q", args.Namespace)-	}--	if err != nil {-		return nil, err-	}--	svc := ee.NewService(r.store, r.httpFactory)-	err = svc.CreateCampaign(ctx, campaign)-	if err != nil {-		return nil, err-	} -	// TODO: This mutation is not done.-	return &campaignResolver{store: r.store, httpFactory: r.httpFactory, Campaign: campaign}, nil+	// TODO(sqs): Implement createCampaign when we've implemented applyCampaign and are happy about+	// how it works.+	return nil, errors.New("createCampaign is not yet implemented (use applyCampaign instead)")

I could imagine this code won't change, only the code in the service layer, so I'm not sure we need to get rid of it here, but your call.

sqs

comment created time in 2 days

pull request comment sourcegraph/sourcegraph

update campaigns docs to reflect new flow

Oh sorry, I missed that 🤦‍♂️ Yes, that resolves it :)

sqs

comment created time in 2 days

Pull request review comment sourcegraph/about

Schedule interviews according to candidate preferences

 We want to mimimize the bias in our hiring process, so we will try to avoid sche

 ## Scheduling

-We are flexible when it comes to scheduling interviews because we are all-remote and we don't need to schedule all interviews back-to-back (unlike typical onsite interviews at other companies).
+We are flexible when it comes to scheduling interviews because we are all-remote and we don't need to schedule all interviews back-to-back (unlike typical onsite interviews at other companies). Some candiates like spreading interviews out over multiple days and others prefer to batch them as much as possible to get it over with. *Ask the candidate what their preferences are and then try to accomidate those preferences as much as possible (given interviewer availability)*.
We are flexible when it comes to scheduling interviews because we are all-remote and we don't need to schedule all interviews back-to-back (unlike typical onsite interviews at other companies). Some candiates like spreading interviews out over multiple days and others prefer to batch them as much as possible to get it over with. *Ask the candidate what their preferences are and then try to accommodate those preferences as much as possible (given interviewer availability)*.
nicksnyder

comment created time in 2 days

Pull request review comment sourcegraph/sourcegraph

update campaigns docs to reflect new flow

+{
+  "$schema": "http://json-schema.org/draft-07/schema#",
+  "title": "CampaignTemplate",
+  "description": "CampaignTemplate describes how to produce a CampaignSpec.",
+  "type": "object",
+  "additionalProperties": false,
+  "properties": {
+    "fooscripts": {
+      "description": "A list of scripts that are executed to produce changesets.",
+      "type": "array",
+      "items": {"$ref": "#/definitions/Script"}
+    }
+  },
+    "definitions": {
+      "Script": {
+        "description": "A script that is executed to produce changesets.",
+        "type": "object",
+        "additionalProperties": false,
+        "properties": {
+          "type": {"type":"string", "enum": ["docker", "command"]},
+

just making it valid json

          "type": {"type":"string", "enum": ["docker", "command"]}
sqs

comment created time in 2 days

issue comment sourcegraph/sourcegraph

Campaigns: 3.18 Tracking Issue

Weekly Update July 6 2020

cc @nicksnyder

After Adam realized the work on GitLab support is probably going to be less than we estimated, he chimed in last week on the discussions around the new campaigns workflow. We still have a lot of questions and underspecified requirements to fill in on this list, which I am going to work on for the first half of the week. On Thursday we are having a team sync to make a call on whether we want to take a slightly different path, which we think will make the state of resources in the backend much more explicit. It's a trade-off between complexity and getting campaigns right, and we decided in today's sync to favor getting it right.

Hence, we don't have a definite release date for campaigns right now. In the past weeks we anticipated the August iteration, but we decided it will be more valuable to get it right on day one of the GA version of campaigns rather than rushing it. Given that, we don't think we will merge our tracking PR for this iteration. We still have GitLab support and improvements to changeset tracking for unsupported code hosts on our plate for this iteration, so we still have some very valuable additions to the campaigns product anyway.

In my opinion, the biggest unknown right now is time. It's likely that we won't merge the new workflow this iteration, but rather incubate it for a little longer. We don't have a complete list of tasks for the campaigns workflow revamp, and its scope changes daily while we think through the use cases and API design. Once we have that in place, work should return to a much more structured approach, getting rid of the big issues that have large estimates and no concrete tasks defined.

As in the past weeks, we are still in need of a PM, because Quinn has too much on his plate and shouldn't need to be concerned with everything going on in campaigns.


@nicksnyder As this is my first time writing a team report, please feel free to ping me if you are missing context or I left questions unanswered, so I can follow up on those.

mrnugget

comment created time in 2 days

issue comment sourcegraph/sourcegraph

Campaigns: 3.18 Tracking Issue

Weekly update: 2020-06-29 to 2020-07-03

Last week

I worked on making changesets for unsupported code hosts work in Sourcegraph and src-cli, and finished up the spike work on evaluating the paginated search API for campaigns usage. I also tinkered with the new API and raised some questions, which will be up for discussion in today's sync (write-up coming this week).

This week

I am going to work on clearing out the remaining questions for the campaigns workflow, and start implementation work where possible.

mrnugget

comment created time in 3 days

issue closed sourcegraph/sourcegraph

Use paginated search in src-cli to fetch repositories for campaigns

After running into https://github.com/sourcegraph/sourcegraph/issues/10540 and experimenting with alternate forms of querying in https://github.com/sourcegraph/src-cli/pull/226 I think we should investigate whether paginated search can solve our problems.

I want to do a spike to see whether it solves our problem and could work for us in campaigns, because the known limitations of the current paginated search implementation don't seem relevant to campaigns (since we're not (yet) interested in file matches, but only want repositories).

Docs:

  • https://docs.sourcegraph.com/api/graphql/search#sourcegraph-3-9-experimental-paginated-search
  • https://docs.sourcegraph.com/dev/architecture/search-pagination

closed time in 3 days

mrnugget

issue comment sourcegraph/sourcegraph

Use paginated search in src-cli to fetch repositories for campaigns

That's the other idea we had, right? That we do a search but have something like a "onlyReturnUniqueRepos: true" parameter, right?

Yes, exactly.

Report

Why are we interested in paginated search for campaigns?

In order to determine the matching repositories for a scopeQuery, we have to iterate over all search results and get the repository of each match. That can potentially yield hundreds of thousands of results or more, all of which we need to query from the search API and parse, which often results in memory problems or timeouts. To reduce that pressure, we wanted to investigate whether paginated search can help us here by, say, only querying 1,000 results at a time.
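
A rough sketch of that idea in TypeScript; the page and cursor shape here is hypothetical (the real experimental API is described in the docs linked below), the point is only that we would accumulate unique repositories page by page instead of holding every file match in memory at once:

// Hypothetical page shape and fetch helper, for illustration only.
interface SearchPage {
    repositories: string[]
    endCursor: string | null
}

declare function fetchSearchPage(scopeQuery: string, first: number, after: string | null): Promise<SearchPage>

// Collect the unique repositories matching a scopeQuery, 1,000 results at a time.
export async function reposForScopeQuery(scopeQuery: string, pageSize = 1000): Promise<Set<string>> {
    const repos = new Set<string>()
    let cursor: string | null = null
    do {
        const page = await fetchSearchPage(scopeQuery, pageSize, cursor)
        for (const repo of page.repositories) {
            repos.add(repo)
        }
        cursor = page.endCursor
    } while (cursor !== null)
    return repos
}
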

What did we do to evaluate it?

Implemented a prototype in src-cli, evaluated the following documentation:

  • https://docs.sourcegraph.com/api/graphql/search#sourcegraph-3-9-experimental-paginated-search
  • https://docs.sourcegraph.com/dev/architecture/search-pagination

and took performance/memory measurements on the search, to see whether it has a positive impact.

What was the outcome?

It turns out paginated search doesn't work for all kinds of search queries. Hence, we would need to educate people to use type:file as much as possible, and fall back to non-paginated search for all other queries. In addition, the compute overhead for pagination can apparently be quite high, causing the same memory consumption and much higher overall compute time when we need to fetch many pages.

Based on the outcome, does it make sense for campaigns to switch to paginated search?

Not at this point in time. It would only work when a user searches using type:file, and even then it would still require lots of resources, making it not an easy win in my opinion. Tweaking the per-page parameter to a performant value can also depend on the search query, which makes it hard to come up with a generalized solution. If the search API were to return the repos that matched, we would likely not need this approach anyway.

If not, what would it take for paginated search to be a viable option?

(This is from the perspective of someone who only read the above-mentioned documentation, but didn't dive into the code.)

  • Support for all search query types would be helpful.
  • Less pressure on the search backend, which ideally only has to compute the number of results that we actually ask for.

Given this was a spike, I am going to close this, as the spike work has been done.

mrnugget

comment created time in 3 days

Pull request review comment sourcegraph/sourcegraph

Add a basic implementation of the applyCampaign mutation

 func (s *Store) GetCampaign(ctx context.Context, opts GetCampaignOpts) (*campaig 	return &c, nil } -var getCampaignsQueryFmtstr = `+var getCampaignsQueryFmtstrPre = ` -- source: enterprise/internal/campaigns/store.go:GetCampaign SELECT-  id,-  name,-  description,-  branch,-  author_id,-  namespace_user_id,-  namespace_org_id,-  created_at,-  updated_at,-  changeset_ids,-  patch_set_id,-  closed_at+  campaigns.id,+  campaigns.name,+  campaigns.description,+  campaigns.branch,+  campaigns.author_id,+  campaigns.namespace_user_id,+  campaigns.namespace_org_id,+  campaigns.created_at,+  campaigns.updated_at,+  campaigns.changeset_ids,+  campaigns.patch_set_id,+  campaigns.closed_at,+  campaigns.campaign_spec_id FROM campaigns+`++var getCampaignsQueryFmtstrPost = ` WHERE %s LIMIT 1 `  func getCampaignQuery(opts *GetCampaignOpts) *sqlf.Query { 	var preds []*sqlf.Query 	if opts.ID != 0 {-		preds = append(preds, sqlf.Sprintf("id = %s", opts.ID))+		preds = append(preds, sqlf.Sprintf("campaigns.id = %s", opts.ID)) 	}  	if opts.PatchSetID != 0 {-		preds = append(preds, sqlf.Sprintf("patch_set_id = %s", opts.PatchSetID))+		preds = append(preds, sqlf.Sprintf("campaigns.patch_set_id = %s", opts.PatchSetID))+	}++	if opts.CampaignSpecID != 0 {+		preds = append(preds, sqlf.Sprintf("campaigns.campaign_spec_id = %s", opts.CampaignSpecID))+	}++	if opts.NamespaceUserID != 0 {+		preds = append(preds, sqlf.Sprintf("campaigns.namespace_user_id = %s", opts.NamespaceUserID))+	}++	if opts.NamespaceOrgID != 0 {+		preds = append(preds, sqlf.Sprintf("campaigns.namespace_org_id = %s", opts.NamespaceOrgID)) 	}  	if len(preds) == 0 { 		preds = append(preds, sqlf.Sprintf("TRUE")) 	} -	return sqlf.Sprintf(getCampaignsQueryFmtstr, sqlf.Join(preds, "\n AND "))+	var joinClause string+	if opts.CampaignSpecName != "" {

should there be a protection for opts.CampaignSpecName != "" && opts.NamespaceUserID + opts.NamespaceOrgID == 0?

mrnugget

comment created time in 6 days

pull request comment sourcegraph/sourcegraph

update docs for new campaign flow

From here:

Yes, I think we need to switch to spec-only publishing of changesets. We don't want to add an extra place where the user's desired state lives (i.e., by adding a publishChangeset mutation and flipping a bit on a changeset when they call that mutation).

That makes me wonder whether we should add a "close flag" to the spec as well. The desired state would be closed: true, so if we stick to this model strictly, I think we would need to do that as well. However, I'm not super sure we want the UI to be basically read-only. Just to gradually roll out a campaign by publishing one changeset after the other, it feels quite burdensome to always pull out an editor, set a boolean flag on one of the changesets, and submit it via src-cli. I do see that this is how Kubernetes works and that we wanted our API to be loosely inspired by it, but I am unsure whether it would make the product harder to use.

Looking into the future, I would imagine server-side execution is implemented and I can 1. create my manifest in the UI using Monaco and 2. pick from a curated list of pre-baked campaigns to make my code a bit healthier. Given that, I would either need to create a file locally to track that, or have that file "tracked" somewhere on the Sourcegraph side and be able to edit it from the UI. For Kubernetes, however, there is no such thing as a "single source of truth manifest" outside of the k8s API: I can patch the manifest without knowing the whole original file, I can edit it from another location and have an outdated manifest in my repo, and so on. Given that, we could maybe pick this up as well: update the spec in the DB from the UI by allowing "Publish" (for example) via a button, which then updates the spec. In return, I can get the manifest with src campaign get <id> -o yaml to update my local (repo) state.
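
To make the trade-off concrete, one possible shape of such a spec (purely illustrative, not the actual campaign spec schema) could look like this, where publishing or closing a changeset is expressed by editing the spec and re-applying it rather than by a separate mutation:

// Field names are assumptions for the sake of the discussion above.
interface ChangesetSpecSketch {
    repository: string
    branch: string
    // Flipping this to true and re-applying the spec publishes the changeset.
    published: boolean
    // The "close flag" discussed above: closed: true as the desired state.
    closed?: boolean
}

interface CampaignSpecSketch {
    name: string
    description: string
    changesetSpecs: ChangesetSpecSketch[]
}
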

sqs

comment created time in 6 days

Pull request review comment sourcegraph/sourcegraph

update docs for new campaign flow

 type Mutation {     # repository permissions and syncs them to Sourcegraph, so that the current permissions apply to     # the user's operations on Sourcegraph.     scheduleUserPermissionsSync(user: ID!): EmptyResponse!++    #+    # CAMPAIGNS+    #++    # Create a campaign from a campaign spec and locally computed changeset specs. If a campaign in+    # the same namespace with the same name already exists, an error is returned. The newly created+    # campaign is returned.+    createCampaign(+        # The campaign's namespace (either a user or organization).+        namespace: ID!++        # The campaign spec that describes the desired state of the campaign.+        campaignSpec: ID!+    ): Campaign!++    # Create or update a campaign from a campaign spec and locally computed changeset specs. If no+    # campaign exists in the namespace with the name given in the campaign spec, a campaign will be+    # created. Otherwise, the existing campaign will be updated. The campaign is returned.+    applyCampaign(+        # The campaign's namespace (either a user or organization).+        namespace: ID!++        # The campaign spec that describes the new desired state of the campaign.+        campaignSpec: ID!++        # If set, return an error if the campaign identified using the namespace and campaignSpec+        # parameters does not match the campaign with this ID. This lets callers use a stable ID+        # that refers to a specific campaign during an edit session (and is not susceptible to+        # conflicts if the underlying campaign is moved to a different namespace, renamed, or+        # deleted).+        ensureCampaign: ID+    ): Campaign!++    # Move a campaign to a different namespace, or rename it in the current namespace.+    moveCampaign(campaign: ID!, newName: String, newNamespace: ID): Campaign!++    # Close a campaign.+    closeCampaign(+        campaign: ID!+        # Whether to close the changesets associated with this campaign on their respective code+        # hosts. "Close" means the appropriate final state on the code host (e.g., "closed" on+        # GitHub and "declined" on Bitbucket Server).+        closeChangesets: Boolean = false+    ): Campaign!++    # Delete a campaign. A deleted campaign is completely removed and can't be un-deleted. The+    # campaign's changesets are kept as-is; to close them, use the closeCampaign mutation first.+    deleteCampaign(campaign: ID!): EmptyResponse++    # Upload a changeset spec that will be used in a future update to a campaign. The changeset spec+    # is stored and can be referenced by its ID in the applyCampaign mutation. Just uploading the+    # changeset spec does not result in changes to the campaign or any of its changesets; you need+    # to call applyCampaign to use it.+    #+    # You can use this mutation to upload large changeset specs (e.g., containing large diffs) in+    # individual HTTP requests. Then, in the eventual applyCampaign call, you just refer to the+    # changeset specs by their IDs. 
This lets you avoid problems when updating large campaigns where+    # a large HTTP request body (e.g., with many large diffs in the changeset specs) would be+    # rejected by the web server/proxy or would be very slow.+    #+    # The returned ChangesetSpec is immutable and expires after a certain period of time (if not+    # used in a call to applyCampaign), which can be queried on ChangesetSpec.expiresAt.+    createChangesetSpec(+        # The raw changeset spec (as JSON).+        changesetSpec: String!+    ): ChangesetSpec!++    # Create a campaign spec that will be used to create a campaign (with the createCampaign+    # mutation), to update to a campaign (with the applyCampaign mutation), or to preview either+    # operation (with the campaignDelta query).+    #+    # The returned CampaignSpec is immutable and expires after a certain period of time (if not used+    # in a call to applyCampaign), which can be queried on CampaignSpec.expiresAt.+    createCampaignSpec(+        # The namespace (either a user or organization). A campaign spec can only be applied to (or+        # used to create) campaigns in this namespace.+        namespace: ID!++        # The campaign spec as YAML (or the equivalent JSON).+        campaignSpec: String!++        # Changeset specs that were locally computed and then uploaded using createChangesetSpec.+        changesetSpecs: [ID!]!+    ): CampaignSpec!++    # Enqueue the given changeset for high-priority syncing.+    syncChangeset(changeset: ID!): EmptyResponse!+}++# A changeset spec is an immutable description of the desired state of a changeset in a campaign. To+# create a changeset spec, use the createChangesetSpec mutation.+type ChangesetSpec implements Node {+    # The unique ID for a changeset spec.+    #+    # The ID is unguessable (i.e., long and randomly generated, not sequential). This is important+    # even though repository permissions also apply to viewers of changeset specs, because being+    # allowed to view a repository should not entitle a person to view all not-yet-published+    # changesets for that repository. Consider a campaign to fix a security vulnerability: the+    # campaign author may prefer to prepare all of the changesets in private so that the window+    # between revealing the problem and merging the fixes is as short as possible.+    id: ID!++    # TODO(sqs): add fields - the main requirements here will be imposed by the Campaign.changesets+    # field and the CampaignDelta field describing the delta to changesets.++    # The date, if any, when this changeset spec expires and is automatically purged. A changeset+    # spec never expires if its campaign spec has been applied.+    expiresAt: DateTime+}++# A campaign spec is an immutable description of the desired state of a campaign. To create a+# campaign spec, use the createCampaignSpec mutation.+type CampaignSpec implements Node {+    # The unique ID for a campaign spec.+    #+    # TODO(sqs): document permissions and ID guessability+    id: ID!++    # The original YAML or JSON input that was used to create this campaign spec.+    originalInput: String!++    # The parsed JSON value of the original input. 
If the original input was YAML, the YAML is+    # converted to the equivalent JSON.+    parsedInput: JSONValue!++    # The specs for changesets associated with this campaign.+    changesetSpecs: [ChangesetSpec!]!++    # The user who created this campaign spec (or null if the user no longer exists).+    creator: User++    # The date when this campaign spec was created.+    createdAt: DateTime++    # The namespace (either a user or organization) of the campaign spec.

should we rather prune specs that are in a removed namespace?

sqs

comment created time in 6 days

pull request comment sourcegraph/sourcegraph

Update dependency highlight.js to ^10.1.1

Updated!

renovate[bot]

comment created time in 6 days

push event sourcegraph/sourcegraph

Erik Seliger

commit sha f820e6a5c275b7ca0db6c2bad3bc8fcc7e0e28b0

Update snapshot


push time in 6 days

push event sourcegraph/sourcegraph

Erik Seliger

commit sha 6c2ddc330e720e5e050a84f6e501a3f404858a0d

Fix usage


push time in 6 days

push event sourcegraph/sourcegraph

Erik Seliger

commit sha 5748ed27e2dde8a8b7ae9b1507b7ca5e8084ada8

Remove bundled version of fromFetch


push time in 6 days

push event sourcegraph/sourcegraph

renovate[bot]

commit sha b1b4247160c75825d9eb975817be450d7934f73c

Update dependency @types/semver to v7.3.1 (#11852) Co-authored-by: Renovate Bot <bot@renovateapp.com>


push time in 6 days

delete branch sourcegraph/sourcegraph

delete branch : renovate/semver-7.x

delete time in 6 days

PR merged sourcegraph/sourcegraph

Update dependency @types/semver to v7.3.1 (labels: bot, npm)

This PR contains the following updates:

Package Type Update New value References Sourcegraph
@types/semver devDependencies minor 7.3.1 source code search for "@types/semver"

Renovate configuration

:date: Schedule: "on the 1st through 7th day of the month" in timezone America/Los_Angeles.

:vertical_traffic_light: Automerge: Disabled due to failing status checks.

:recycle: Rebasing: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.

:no_bell: Ignore: Close this PR and you won't be reminded about this update again.


  • [ ] If you want to rebase/retry this PR, check this box

This PR has been generated by WhiteSource Renovate. View repository job log here.

+5 -5

0 comments

2 changed files

renovate[bot]

pr closed time in 6 days

push event sourcegraph/sourcegraph

renovate[bot]

commit sha bb2d05eca127483b1119a26342412ff1959e2f45

Update dependency @types/webpack to v4.41.18 (#11853) Co-authored-by: Renovate Bot <bot@renovateapp.com>


push time in 6 days

delete branch sourcegraph/sourcegraph

delete branch : renovate/webpack-4.x

delete time in 6 days

PR merged sourcegraph/sourcegraph

Update dependency @types/webpack to v4.41.18 (labels: bot, npm)

This PR contains the following updates:

Package Type Update New value References Sourcegraph
@types/webpack devDependencies patch 4.41.18 source code search for "@types/webpack"

Renovate configuration

:date: Schedule: "on the 1st through 7th day of the month" in timezone America/Los_Angeles.

:vertical_traffic_light: Automerge: Disabled due to failing status checks.

:recycle: Rebasing: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.

:no_bell: Ignore: Close this PR and you won't be reminded about this update again.


  • [ ] If you want to rebase/retry this PR, check this box

This PR has been generated by WhiteSource Renovate. View repository job log here.

+5 -5

1 comment

2 changed files

renovate[bot]

pr closed time in 6 days

push event sourcegraph/sourcegraph

renovate[bot]

commit sha eb4595a3c7c60e4dd87a50cfe9caa3d0950cefde

Update dependency react-dom-confetti to ^0.1.4 (#11859) Co-authored-by: Renovate Bot <bot@renovateapp.com>


push time in 6 days

delete branch sourcegraph/sourcegraph

delete branch : renovate/react-dom-confetti-0.x

delete time in 6 days

PR merged sourcegraph/sourcegraph

Update dependency react-dom-confetti to ^0.1.4 (labels: bot, npm)

This PR contains the following updates:

Package Type Update New value References Sourcegraph
react-dom-confetti dependencies patch ^0.1.4 source code search for "react-dom-confetti"

Release Notes

daniel-lundin/react-dom-confetti

v0.1.4


Renovate configuration

:date: Schedule: "on the 1st through 7th day of the month" in timezone America/Los_Angeles.

:vertical_traffic_light: Automerge: Disabled by config. Please merge this manually once you are satisfied.

:recycle: Rebasing: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.

:no_bell: Ignore: Close this PR and you won't be reminded about this update again.


  • [ ] If you want to rebase/retry this PR, check this box

This PR has been generated by WhiteSource Renovate. View repository job log here.

+5 -5

2 comments

2 changed files

renovate[bot]

pr closed time in 6 days

delete branch sourcegraph/sourcegraph

delete branch : renovate/sanitize-html-1.x

delete time in 6 days

PR closed sourcegraph/sourcegraph

Update dependency sanitize-html to ^1.27.0 (labels: bot, npm)

This PR contains the following updates:

Package Type Update New value References Sourcegraph
sanitize-html dependencies minor ^1.27.0 source code search for "sanitize-html"
@types/sanitize-html devDependencies patch 1.23.3 source code search for "@types/sanitize-html"

Release Notes

apostrophecms/sanitize-html

v1.27.0


Renovate configuration

:date: Schedule: "on the 1st through 7th day of the month" in timezone America/Los_Angeles.

:vertical_traffic_light: Automerge: Disabled by config. Please merge this manually once you are satisfied.

:recycle: Rebasing: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.

:no_bell: Ignore: Close this PR and you won't be reminded about these updates again.


  • [ ] If you want to rebase/retry this PR, check this box

This PR has been generated by WhiteSource Renovate. View repository job log here.

+10 -10

1 comment

2 changed files

renovate[bot]

pr closed time in 6 days

pull request comment sourcegraph/sourcegraph

Update dependency sanitize-html to ^1.27.0

No changes that affect our usage

renovate[bot]

comment created time in 6 days

push event sourcegraph/sourcegraph

renovate[bot]

commit sha a4efdc7ac8f1d44154ec858b9531b0e5900f7140

Update dependency slugify to ^1.4.4 (#11862) Co-authored-by: Renovate Bot <bot@renovateapp.com>


push time in 7 days

delete branch sourcegraph/sourcegraph

delete branch : renovate/slugify-1.x

delete time in 7 days

PR merged sourcegraph/sourcegraph

Update dependency slugify to ^1.4.4 (labels: bot, npm)

This PR contains the following updates:

Package Type Update New value References Sourcegraph
slugify dependencies patch ^1.4.4 source code search for "slugify"

Release Notes

simov/slugify

v1.4.4

v1.4.3

v1.4.2

v1.4.1


Renovate configuration

:date: Schedule: "on the 1st through 7th day of the month" in timezone America/Los_Angeles.

:vertical_traffic_light: Automerge: Disabled by config. Please merge this manually once you are satisfied.

:recycle: Rebasing: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.

:no_bell: Ignore: Close this PR and you won't be reminded about this update again.


  • [ ] If you want to rebase/retry this PR, check this box

This PR has been generated by WhiteSource Renovate. View repository job log here.

+5 -5

1 comment

2 changed files

renovate[bot]

pr closed time in 7 days

push event sourcegraph/sourcegraph

renovate[bot]

commit sha d92755e61a5172bc30dc5640aa06a5d1fd69a6d9

Update dependency stylelint to ^13.6.1 (#11863) Co-authored-by: Renovate Bot <bot@renovateapp.com>


push time in 7 days

delete branch sourcegraph/sourcegraph

delete branch : renovate/stylelint-13.x

delete time in 7 days

PR merged sourcegraph/sourcegraph

Update dependency stylelint to ^13.6.1 (labels: bot, npm)

This PR contains the following updates:

Package Type Update New value References Sourcegraph
stylelint (source) devDependencies patch ^13.6.1 homepage, source code search for "stylelint"

Release Notes

stylelint/stylelint

v13.6.1

  • Fixed: max-empty-lines TypeError from inline comment with autofix and sugarss syntax (#​4821).
  • Fixed: property-no-unknown false positives for namespaced variables (#​4803).
  • Fixed: selector-type-no-unknown false positives for idents within ::part pseudo-elements (#​4828).



Renovate configuration

:date: Schedule: "on the 1st through 7th day of the month" in timezone America/Los_Angeles.

:vertical_traffic_light: Automerge: Disabled due to failing status checks.

:recycle: Rebasing: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.

:no_bell: Ignore: Close this PR and you won't be reminded about this update again.


  • [ ] If you want to rebase/retry this PR, check this box

This PR has been generated by WhiteSource Renovate. View repository job log here.

+10 -10

2 comments

2 changed files

renovate[bot]

pr closed time in 7 days

push event sourcegraph/sourcegraph

renovate[bot]

commit sha 50ff326e9bef08e61a899e530cd1dc712949a16d

Update dependency @sentry/browser to ^5.19.0 (#11842) Co-authored-by: Renovate Bot <bot@renovateapp.com>


push time in 7 days

delete branch sourcegraph/sourcegraph

delete branch : renovate/sentry-monorepo

delete time in 7 days

PR merged sourcegraph/sourcegraph

Update dependency @sentry/browser to ^5.19.0 (labels: bot, npm)

This PR contains the following updates:

Package Type Update New value References Sourcegraph
@sentry/browser dependencies minor ^5.19.0 source code search for "@sentry/browser"

Release Notes

getsentry/sentry-javascript

v5.19.0

  • [tracing] feat: Pick up sentry-trace in JS <meta/> tag (#​2703)
  • [react] feat: Expose eventId on ErrorBoundary component (#​2704)
  • [node] fix: Extract transaction from nested express paths correctly (#​2714)
  • [tracing] fix: Respect fetch headers (#​2712) (#​2713)
  • [tracing] fix: Check if performance.getEntries() exists (#​2710)
  • [tracing] fix: Add manual Location typing (#​2700)
  • [tracing] fix: Respect sample decision when continuing trace from header in node (#​2703)
  • [tracing] fix: All options of adding fetch headers (#​2712)
  • [gatsby] fix: Add gatsby SDK identifier (#​2709)
  • [gatsby] fix: Package gatsby files properly (#​2711)

v5.18.1

  • [react] feat: Update peer dependencies for react and react-dom (#​2694)
  • [react] ref: Change Profiler prop names (#​2699)

v5.18.0

  • [react] feat: Add @​sentry/react package (#​2631)
  • [react] feat: Add Error Boundary component (#​2647)
  • [react] feat: Add useProfiler hook (#​2659)
  • [core] feat: Export makeMain (#​2665)
  • [core] fix: Call bindClient when creating new Hub to make integrations work automatically (#​2665)
  • [gatsby] feat: Add @​sentry/gatsby package (#​2652)
  • [tracing] feat: Add scope.getTransaction to return a Transaction if it exists (#​2668)
  • [tracing] ref: Deprecate scope.setTransaction in favor of scope.setTransactionName (#​2668)
  • [core] ref: Rename whitelistUrls/blacklistUrls to allowUrls/denyUrls (#​2671)
  • [react] ref: Refactor Profiler to account for update and render (#​2677)
  • [apm] feat: Add ability to get span from activity using getActivitySpan (#​2677)
  • [apm] fix: Check if performance.mark exists before calling it (#​2680)
  • [tracing] feat: Add beforeNavigate option (#​2691)
  • [tracing] ref: Create navigation transactions using window.location.pathname instead of window.location.href (#​2691)

v5.17.0

  • [browser] feat: Support fetchParameters (#​2567)
  • [apm] feat: Report LCP metric on pageload transactions (#​2624)
  • [core] fix: Normalize Transaction and Span consistently (#​2655)
  • [core] fix: Handle DSN qs and show better error messages (#​2639)
  • [browser] fix: Change XHR instrumentation order to handle onreadystatechange breadcrumbs correctly (#​2643)
  • [apm] fix: Re-add TraceContext for all events (#​2656)
  • [integrations] fix: Change Vue interface to be inline with the original types (#​2634)
  • [apm] ref: Use startTransaction where appropriate (#​2644)



Renovate configuration

:date: Schedule: "on the 1st through 7th day of the month" in timezone America/Los_Angeles.

:vertical_traffic_light: Automerge: Disabled by config. Please merge this manually once you are satisfied.

:recycle: Rebasing: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.

:no_bell: Ignore: Close this PR and you won't be reminded about this update again.


  • [ ] If you want to rebase/retry this PR, check this box

This PR has been generated by WhiteSource Renovate. View repository job log here.

+38 -38

1 comment

2 changed files

renovate[bot]

pr closed time in 7 days

Pull request review comment sourcegraph/codeintellify

feat: highlight visible references

 export function createHoverifier<C extends object, D, A>({             })     ) +    /**+     * For every position, emits an Observable with new values for the `documentHighlightsOrError` state.+     * This is a higher-order Observable (Observable that emits Observables).+     */+    const documentHighlightObservables: Observable<Observable<{+        eventType: SupportedMouseEvent | 'jump'+        dom: DOMFunctions+        target: HTMLElement+        adjustPosition?: PositionAdjuster<C>+        codeView: HTMLElement+        codeViewId: symbol+        scrollBoundaries?: HTMLElement[]+        documentHighlightsOrError?: typeof LOADING | DocumentHighlight[] | ErrorLike | null+        position?: HoveredToken & C+        part?: DiffPart+    }>> = resolvedPositions.pipe(+        map(({ position, codeViewId, ...rest }) => {+            if (!position) {+                return of({+                    documentHighlightsOrError: null,+                    position: undefined,+                    part: undefined,+                    codeViewId,+                    ...rest,+                })+            }+            // Get the document highlights for that position+            return toMaybeLoadingProviderResult(getDocumentHighlights(position)).pipe(+                catchError((error): [MaybeLoadingResult<ErrorLike>] => [{ isLoading: false, result: asError(error) }]),+                emitLoading<DocumentHighlight[] | ErrorLike, null>(LOADER_DELAY, null),+                map(documentHighlightsOrError => ({+                    ...rest,+                    codeViewId,+                    position,+                    documentHighlightsOrError,+                    part: position.part,+                })),+                // Do not emit anything after the code view this action came from got unhoverified+                takeUntil(allUnhoverifies.pipe(filter(unhoverifiedCodeViewId => unhoverifiedCodeViewId === codeViewId)))+            )+        }),+        share()+    )++    // Highlight the ranges returned by the document highlight provider+    subscription.add(+        documentHighlightObservables+            .pipe(+                switchMap(highlightObservable => highlightObservable),+                switchMap(({ documentHighlightsOrError, position, adjustPosition, codeView, part, ...rest }) => {+                    const highlights =+                        documentHighlightsOrError &&+                        documentHighlightsOrError !== LOADING &&+                        !isErrorLike(documentHighlightsOrError)+                            ? documentHighlightsOrError+                            : []++                    if (highlights.length === 0 || !position) {+                        return of({ adjustPosition, codeView, part, ...rest, positions: of<Position[]>([]) })+                    }++                    const positions: Observable<Position>[] = []+                    for (const { range } of highlights) {+                        let pos = { ...position, ...range.start }++                        // The requested position is is 0-indexed; the code here is currently 1-indexed+                        const { line, character } = pos+                        pos = { ...pos, line: line + 1, character: character + 1 }++                        positions.push(+                            adjustPosition+                                ? 
from(+                                      adjustPosition({+                                          codeView,+                                          direction: AdjustmentDirection.ActualToCodeView,+                                          position: {+                                              ...pos,+                                              part,+                                          },+                                      })+                                  )+                                : of(pos)+                        )+                    }++                    return of({+                        adjustPosition,+                        codeView,+                        part,+                        ...rest,+                        positions: combineLatest(positions),+                    })+                }),+                mergeMap(({ positions, codeView, dom, part }) =>+                    positions.pipe(+                        map(highlightedRanges =>+                            highlightedRanges.map(highlightedRange =>+                                highlightedRange+                                    ? getTokenAtPosition(codeView, highlightedRange, dom, part, tokenize)+                                    : undefined+                            )+                        ),+                        map(elements => ({ elements, codeView, dom, part }))+                    )+                )+            )+            .subscribe(({ codeView, elements }) => {+                // Ensure the previously highlighted range is not highlighted and the new highlightedRange (if any)+                // is highlighted.+                const currentHighlighteds = codeView.querySelectorAll('.document-highlight')

@felixfbecker do you agree a sourcegraph- prefix would be the way we want to push the above-mentioned issue forward?

efritz

comment created time in 8 days

Pull request review comment sourcegraph/codeintellify

feat: highlight visible references

 export function createHoverifier<C extends object, D, A>({             })     ) +    /**+     * For every position, emits an Observable with new values for the `documentHighlightsOrError` state.+     * This is a higher-order Observable (Observable that emits Observables).+     */+    const documentHighlightObservables: Observable<Observable<{+        eventType: SupportedMouseEvent | 'jump'+        dom: DOMFunctions+        target: HTMLElement+        adjustPosition?: PositionAdjuster<C>+        codeView: HTMLElement+        codeViewId: symbol+        scrollBoundaries?: HTMLElement[]+        documentHighlightsOrError?: typeof LOADING | DocumentHighlight[] | ErrorLike | null+        position?: HoveredToken & C+        part?: DiffPart+    }>> = resolvedPositions.pipe(+        map(({ position, codeViewId, ...rest }) => {+            if (!position) {+                return of({+                    documentHighlightsOrError: null,+                    position: undefined,+                    part: undefined,+                    codeViewId,+                    ...rest,+                })+            }+            // Get the document highlights for that position+            return toMaybeLoadingProviderResult(getDocumentHighlights(position)).pipe(+                catchError((error): [MaybeLoadingResult<ErrorLike>] => [{ isLoading: false, result: asError(error) }]),+                emitLoading<DocumentHighlight[] | ErrorLike, null>(LOADER_DELAY, null),+                map(documentHighlightsOrError => ({+                    ...rest,+                    codeViewId,+                    position,+                    documentHighlightsOrError,+                    part: position.part,+                })),+                // Do not emit anything after the code view this action came from got unhoverified+                takeUntil(allUnhoverifies.pipe(filter(unhoverifiedCodeViewId => unhoverifiedCodeViewId === codeViewId)))+            )+        }),+        share()+    )++    // Highlight the ranges returned by the document highlight provider+    subscription.add(+        documentHighlightObservables+            .pipe(+                switchMap(highlightObservable => highlightObservable),+                switchMap(({ documentHighlightsOrError, position, adjustPosition, codeView, part, ...rest }) => {+                    const highlights =+                        documentHighlightsOrError &&+                        documentHighlightsOrError !== LOADING &&+                        !isErrorLike(documentHighlightsOrError)+                            ? documentHighlightsOrError+                            : []++                    if (highlights.length === 0 || !position) {+                        return of({ adjustPosition, codeView, part, ...rest, positions: of<Position[]>([]) })+                    }++                    const positions: Observable<Position>[] = []+                    for (const { range } of highlights) {+                        let pos = { ...position, ...range.start }++                        // The requested position is is 0-indexed; the code here is currently 1-indexed+                        const { line, character } = pos+                        pos = { ...pos, line: line + 1, character: character + 1 }++                        positions.push(+                            adjustPosition+                                ? 
from(+                                      adjustPosition({+                                          codeView,+                                          direction: AdjustmentDirection.ActualToCodeView,+                                          position: {+                                              ...pos,+                                              part,+                                          },+                                      })+                                  )+                                : of(pos)+                        )+                    }++                    return of({+                        adjustPosition,+                        codeView,+                        part,+                        ...rest,+                        positions: combineLatest(positions),+                    })+                }),+                mergeMap(({ positions, codeView, dom, part }) =>+                    positions.pipe(+                        map(highlightedRanges =>+                            highlightedRanges.map(highlightedRange =>+                                highlightedRange+                                    ? getTokenAtPosition(codeView, highlightedRange, dom, part, tokenize)+                                    : undefined+                            )+                        ),+                        map(elements => ({ elements, codeView, dom, part }))+                    )+                )+            )+            .subscribe(({ codeView, elements }) => {+                // Ensure the previously highlighted range is not highlighted and the new highlightedRange (if any)+                // is highlighted.+                const currentHighlighteds = codeView.querySelectorAll('.document-highlight')

Yeah, it's tracked here: https://github.com/sourcegraph/sourcegraph/issues/10595 but not done yet. It crossed my mind while reading this, so I thought we could try not to introduce a new instance of it. But maybe that's not necessary, given someone will pick up https://github.com/sourcegraph/sourcegraph/issues/10595 eventually.

efritz

comment created time in 8 days

Pull request review comment sourcegraph/codeintellify

feat: highlight visible references

 export function createHoverifier<C extends object, D, A>({             })     ) +    /**+     * For every position, emits an Observable with new values for the `documentHighlightsOrError` state.+     * This is a higher-order Observable (Observable that emits Observables).+     */+    const documentHighlightObservables: Observable<Observable<{+        eventType: SupportedMouseEvent | 'jump'+        dom: DOMFunctions+        target: HTMLElement+        adjustPosition?: PositionAdjuster<C>+        codeView: HTMLElement+        codeViewId: symbol+        scrollBoundaries?: HTMLElement[]+        documentHighlightsOrError?: typeof LOADING | DocumentHighlight[] | ErrorLike | null+        position?: HoveredToken & C+        part?: DiffPart+    }>> = resolvedPositions.pipe(+        map(({ position, codeViewId, ...rest }) => {+            if (!position) {+                return of({+                    documentHighlightsOrError: null,+                    position: undefined,+                    part: undefined,+                    codeViewId,+                    ...rest,+                })+            }+            // Get the document highlights for that position+            return toMaybeLoadingProviderResult(getDocumentHighlights(position)).pipe(+                catchError((error): [MaybeLoadingResult<ErrorLike>] => [{ isLoading: false, result: asError(error) }]),+                emitLoading<DocumentHighlight[] | ErrorLike, null>(LOADER_DELAY, null),+                map(documentHighlightsOrError => ({+                    ...rest,+                    codeViewId,+                    position,+                    documentHighlightsOrError,+                    part: position.part,+                })),+                // Do not emit anything after the code view this action came from got unhoverified+                takeUntil(allUnhoverifies.pipe(filter(unhoverifiedCodeViewId => unhoverifiedCodeViewId === codeViewId)))+            )+        }),+        share()+    )++    // Highlight the ranges returned by the document highlight provider+    subscription.add(+        documentHighlightObservables+            .pipe(+                switchMap(highlightObservable => highlightObservable),+                switchMap(({ documentHighlightsOrError, position, adjustPosition, codeView, part, ...rest }) => {+                    const highlights =+                        documentHighlightsOrError &&+                        documentHighlightsOrError !== LOADING &&+                        !isErrorLike(documentHighlightsOrError)+                            ? documentHighlightsOrError+                            : []++                    if (highlights.length === 0 || !position) {+                        return of({ adjustPosition, codeView, part, ...rest, positions: of<Position[]>([]) })+                    }++                    const positions: Observable<Position>[] = []+                    for (const { range } of highlights) {+                        let pos = { ...position, ...range.start }++                        // The requested position is is 0-indexed; the code here is currently 1-indexed+                        const { line, character } = pos+                        pos = { ...pos, line: line + 1, character: character + 1 }++                        positions.push(+                            adjustPosition+                                ? 
from(+                                      adjustPosition({+                                          codeView,+                                          direction: AdjustmentDirection.ActualToCodeView,+                                          position: {+                                              ...pos,+                                              part,+                                          },+                                      })+                                  )+                                : of(pos)+                        )+                    }++                    return of({+                        adjustPosition,+                        codeView,+                        part,+                        ...rest,+                        positions: combineLatest(positions),+                    })+                }),+                mergeMap(({ positions, codeView, dom, part }) =>+                    positions.pipe(+                        map(highlightedRanges =>+                            highlightedRanges.map(highlightedRange =>+                                highlightedRange+                                    ? getTokenAtPosition(codeView, highlightedRange, dom, part, tokenize)+                                    : undefined+                            )+                        ),+                        map(elements => ({ elements, codeView, dom, part }))+                    )+                )+            )+            .subscribe(({ codeView, elements }) => {+                // Ensure the previously highlighted range is not highlighted and the new highlightedRange (if any)+                // is highlighted.+                const currentHighlighteds = codeView.querySelectorAll('.document-highlight')

I'd pull this out into a constant, and we should probably prefix it with sourcegraph- or similar

efritz

comment created time in 8 days

Pull request review comment sourcegraph/sourcegraph

Add createCampaignSpec/createChangesetSpec mutations

 type ChangesetSpecResolver interface {
 	// TODO: More fields, see PR
 	ExpiresAt() *DateTime
+
+	// TODO: This is a hack so that a CampaignSpecResolver cannot be cast into

TIL: cool idea
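For readers who haven't seen the trick, here is a minimal, self-contained Go sketch of the general pattern (all names are made up and the actual resolver code may differ): an unexported marker method on each interface means a type satisfying one interface can never be type-asserted into the other by accident.

package main

import "fmt"

// FooResolver and BarResolver are hypothetical stand-ins for two resolver
// interfaces; the unexported marker methods are the interesting part.
type FooResolver interface {
	ID() string
	isFooResolver() // only types in this package can implement this
}

type BarResolver interface {
	ID() string
	isBarResolver()
}

type fooResolver struct{ id string }

func (r *fooResolver) ID() string     { return r.id }
func (r *fooResolver) isFooResolver() {}

func main() {
	var f FooResolver = &fooResolver{id: "foo-1"}
	// The assertion fails because *fooResolver lacks BarResolver's unexported marker.
	_, ok := f.(BarResolver)
	fmt.Println(ok) // false
}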

mrnugget

comment created time in 8 days

delete branch sourcegraph/sourcegraph

delete branch : es/preview-unsupported

delete time in 8 days

push event sourcegraph/sourcegraph

Erik Seliger

commit sha b17f05a03f88bded232aa337e6a77159127e55fd

Implement support to preview unsupported patches (#11793)

view details

push time in 8 days

PR merged sourcegraph/sourcegraph

Implement support to preview unsupported patches campaigns

Closes #11193

image

Tasks from issue:

  • [x] Move the IsRepoSupported checks so that patches from unsupported repositories are accepted, but not published (checks need to be moved to the Service and possibly the ExecChangesetJobs function)
  • [x] In publishChangeset mutation we should return an error
  • [x] In publishCampaignChangesets mutation we should skip them
  • [x] We should add a Publishable: boolean property to Patch in the GraphQL API
  • [x] We need to remove the "is supported" check in src-cli

For the last task: do we really need to? We currently have a flag that allows creating patches for unsupported code hosts, which helps here. Removing the check would mean we default to that behavior, and I think more customers would want to omit those patches by default than would want to store them for future use.

+312 -46

8 comments

16 changed files

eseliger

pr closed time in 8 days

issue closed sourcegraph/sourcegraph

Allow previewing of changesets for unsupported code hosts

Right now, if you try to create a patch on a repo that is not supported (e.g. because it's on GitLab or another unsupported code host), it is silently discarded:

https://sourcegraph.com/github.com/sourcegraph/sourcegraph@615077bc91a6208bb60288fedba19a0adc5d6d1d/-/blob/enterprise/internal/campaigns/service.go#L82:17

This is surprising and prevents someone from trying out campaigns if their code host is not yet supported.

Proposed behavior: You can create patches on any repository (that you can view). In the list of patches, a patch that is against an unsupported repository will not have a publish button and instead will have an indication that it can't be published because the code host is not yet supported. If you try to publish them via the GraphQL API, the publish operation will fail with an error (as it does currently).
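For illustration only, a toy Go sketch of the "can this patch be published?" idea described above; the names and the set of supported code host types here are assumptions, not the actual implementation:

package main

import "fmt"

// Hypothetical stand-in for the real supported-code-host set used by campaigns.
var supportedExternalServices = map[string]struct{}{
	"github":          {},
	"bitbucketServer": {},
}

// publishable would back a Publishable-style boolean on a patch: true only if
// the patch's repository lives on a supported code host.
func publishable(externalServiceType string) bool {
	_, ok := supportedExternalServices[externalServiceType]
	return ok
}

func main() {
	fmt.Println(publishable("github")) // true
	fmt.Println(publishable("gitlab")) // false: shown in the UI without a publish button
}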

TODO(sqs): make a mock for this

closed time in 8 days

sqs

delete branch sourcegraph/src-cli

delete branch : es/unsupported-repo-ui

delete time in 8 days

push event sourcegraph/src-cli

Erik Seliger

commit sha 522a55cfd4c1f5edbfe48a48e07abf2e56b3d93b

Visualize unsupported repos better (#236)

view details

push time in 8 days

PR merged sourcegraph/src-cli

Visualize unsupported repos better

To make it more transparent why some repos are excluded, we now show more clearly which repos are excluded because they aren't supported, and we also display the hint on how to still include them in non-verbose mode. In addition, up to 10 repo names appear in the UI, to give a better sense of which ones are unsupported.

image
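A rough Go sketch of the "show at most 10 repo names" behavior described above (the function name and formatting are illustrative, not the actual src-cli code):

package main

import "fmt"

// summarizeUnsupported is a hypothetical helper: print every name up to a cap,
// then just count the rest.
func summarizeUnsupported(names []string, max int) string {
	if len(names) <= max {
		return fmt.Sprintf("%d unsupported repositories: %v", len(names), names)
	}
	return fmt.Sprintf("%d unsupported repositories, e.g. %v (and %d more)",
		len(names), names[:max], len(names)-max)
}

func main() {
	repos := []string{"gitlab.com/a/a", "gitlab.com/b/b", "gitlab.com/c/c"}
	fmt.Println(summarizeUnsupported(repos, 10))
}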

+34 -23

0 comment

2 changed files

eseliger

pr closed time in 8 days

push event sourcegraph/sourcegraph

Erik Seliger

commit sha 1f73503b770d7f28e2524d592762946b12a6ef7e

Reduce redundancy

view details

push time in 8 days

Pull request review comment sourcegraph/sourcegraph

Implement support to preview unsupported patches

 func computeCampaignUpdateDiff(
 		if group, ok := byRepoID[j.RepoID]; ok {
 			group.newPatch = j
 		} else {
-
+			repo, err := db.Repos.Get(ctx, j.RepoID)
+			if err != nil {
+				return nil, err
+			}

Cool, thanks for the pointer. Did that improvement.

eseliger

comment created time in 8 days

push event sourcegraph/sourcegraph

Erik Seliger

commit sha 44ca9dafb14270fd938891c5d541065518c366f4

Improve performance

view details

push time in 8 days

push event sourcegraph/src-cli

Erik Seliger

commit sha d9abb5e6eecbb095121a735051038d618a8fdd42

Address feedback

view details

push time in 8 days

Pull request review comment sourcegraph/sourcegraph

Implement support to preview unsupported patches

 func (s *Store) ProcessPendingChangesetJobs(ctx context.Context, process func(ct
 		return false, errors.Wrap(err, "starting transaction")
 	}
 	defer tx.Done(&err)
-	q := sqlf.Sprintf(getPendingChangesetJobQuery)
+	supportedTypes := make([]*sqlf.Query, 0)
+	for t := range campaigns.SupportedExternalServices {
+		supportedTypes = append(supportedTypes, sqlf.Sprintf("%s", t))
+	}
+	q := sqlf.Sprintf(getPendingChangesetJobQuery, sqlf.Join(supportedTypes, ","))

GitLab support is supposed to land in 3.18, so we might end up with unprocessed jobs against GitLab in the db right before a downgrade to 3.17 (if any customer ever needs to do that). Those jobs would then be pulled and the behavior is undefined. But I guess that was overambitious, so I removed that code, given that removing it also makes the query faster.
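For context, a small sketch of what the removed filter boiled down to on the SQL side, using the sqlf calls from the diff above; the query text and the two type strings are illustrative (GitHub and Bitbucket Server were the supported code hosts at the time), not the actual getPendingChangesetJobQuery:

package main

import (
	"fmt"

	"github.com/keegancsmith/sqlf"
)

func main() {
	supportedTypes := []*sqlf.Query{
		sqlf.Sprintf("%s", "github"),
		sqlf.Sprintf("%s", "bitbucketServer"),
	}
	// Joining the per-type fragments yields an IN clause with one bind var per type.
	q := sqlf.Sprintf(
		"SELECT id FROM changeset_jobs WHERE external_service_type IN (%s)",
		sqlf.Join(supportedTypes, ","),
	)
	fmt.Println(q.Query(sqlf.PostgresBindVar)) // roughly: ... IN ($1,$2)
	fmt.Println(q.Args())                      // [github bitbucketServer]
}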

eseliger

comment created time in 8 days

push event sourcegraph/sourcegraph

Erik Seliger

commit sha 991873adea370e10e8759b3d03fb921ed2063135

Strengthen test

view details

push time in 9 days

pull request comment sourcegraph/sourcegraph

Implement support to preview unsupported patches

Related PR on src CLI: https://github.com/sourcegraph/src-cli/pull/236

eseliger

comment created time in 9 days

PR opened sourcegraph/src-cli

Visualize unsupported repos better

To make it more transparent why some repos are excluded, we now show more clearly which repos are excluded because they aren't supported, and we also display the hint on how to still include them in non-verbose mode. In addition, up to 10 repo names appear in the UI, to give a better sense of which ones are unsupported.

image

+34 -20

0 comment

2 changed files

pr created time in 9 days

create branch sourcegraph/src-cli

branch : es/unsupported-repo-ui

created branch time in 9 days

Pull request review comment sourcegraph/sourcegraph

Implement support to preview unsupported patches

 func (s *Store) ProcessPendingChangesetJobs(ctx context.Context, process func(ct
 		return false, errors.Wrap(err, "starting transaction")
 	}
 	defer tx.Done(&err)
-	q := sqlf.Sprintf(getPendingChangesetJobQuery)
+	supportedTypes := make([]*sqlf.Query, 0)
+	for t := range campaigns.SupportedExternalServices {
+		supportedTypes = append(supportedTypes, sqlf.Sprintf("%s", t))
+	}
+	q := sqlf.Sprintf(getPendingChangesetJobQuery, sqlf.Join(supportedTypes, ","))

A downgrade from 3.18 to 3.17 could have caused that situation. But I decided to remove it.

eseliger

comment created time in 9 days

push event sourcegraph/sourcegraph

Erik Seliger

commit sha 90804d84559d4d954fd6aee11d5fac133c783b8d

Add test

view details

push time in 9 days

push event sourcegraph/sourcegraph

Erik Seliger

commit sha 83c8eaec5bbcee8a7b250fa8937bbb7309679cd9

Improve performance

view details

push time in 9 days

push event sourcegraph/sourcegraph

Erik Seliger

commit sha f728554f28dcbb830acd7bc3281fe8216eabc1dc

Add check for update

view details

push time in 9 days

push event sourcegraph/sourcegraph

Erik Seliger

commit sha 1b6988d563cdda56e087900020b3cfe469c963e8

Simplify implementation

view details

push time in 9 days

Pull request review comment sourcegraph/sourcegraph

Implement support to preview unsupported patches

 func (s *Store) ProcessPendingChangesetJobs(ctx context.Context, process func(ct
 		return false, errors.Wrap(err, "starting transaction")
 	}
 	defer tx.Done(&err)
-	q := sqlf.Sprintf(getPendingChangesetJobQuery)
+	supportedTypes := make([]*sqlf.Query, 0)
+	for t := range campaigns.SupportedExternalServices {
+		supportedTypes = append(supportedTypes, sqlf.Sprintf("%s", t))
+	}
+	q := sqlf.Sprintf(getPendingChangesetJobQuery, sqlf.Join(supportedTypes, ","))

It's just extra caution: when there is a downgrade, for example, this avoids picking up old jobs that target unsupported code hosts 🤔 For the sake of simplicity, we can probably remove it.

eseliger

comment created time in 9 days

Pull request review comment sourcegraph/sourcegraph

Implement support to preview unsupported patches

 func TestService(t *testing.T) {
 		rs = append(rs, r)
 	}
 
+	awsCodeCommitRepoID := 4
+	{
+		r := testRepo(awsCodeCommitRepoID, extsvc.TypeAWSCodeCommit)
+		r.Sources = map[string]*repos.SourceInfo{ext.URN(): {ID: ext.URN()}}
+		rs = append(rs, r)
+	}

I personally find it very hard to grasp which one does what, but if we want to keep that style, I can change it.

eseliger

comment created time in 9 days

Pull request review comment sourcegraph/sourcegraph

Add GraphQL resolvers for CampaignSpec/ChangesetSpec

 package resolvers  import (+	"context" 	"time"  	"github.com/graph-gophers/graphql-go"+	"github.com/graph-gophers/graphql-go/relay" 	"github.com/pkg/errors" 	"github.com/sourcegraph/sourcegraph/cmd/frontend/graphqlbackend"+	ee "github.com/sourcegraph/sourcegraph/enterprise/internal/campaigns"+	"github.com/sourcegraph/sourcegraph/internal/campaigns"+	"github.com/sourcegraph/sourcegraph/internal/errcode"+	"github.com/sourcegraph/sourcegraph/internal/httpcli" ) +func marshalCampaignSpecRandID(id string) graphql.ID {+	return relay.MarshalID("CampaignSpec", id)+}++func unmarshalCampaignSpecID(id graphql.ID) (campaignSpecRandID string, err error) {+	err = relay.UnmarshalSpec(id, &campaignSpecRandID)+	return+}+ var _ graphqlbackend.CampaignSpecResolver = &campaignSpecResolver{}  type campaignSpecResolver struct {+	store       *ee.Store+	httpFactory *httpcli.Factory++	campaignSpec *campaigns.CampaignSpec } -func (r *campaignSpecResolver) ID() (graphql.ID, error) {-	return "", errors.New("TODO: not implemented")+func (r *campaignSpecResolver) ID() graphql.ID {+	// 🚨 SECURITY: This needs to be the RandID! We can't expose the+	// sequential, guessable ID.+	return marshalCampaignSpecRandID(r.campaignSpec.RandID) }  func (r *campaignSpecResolver) OriginalInput() (string, error) {-	return "", errors.New("TODO: not implemented")+	return r.campaignSpec.RawSpec, nil }  func (r *campaignSpecResolver) ParsedInput() (graphqlbackend.JSONValue, error) {-	return graphqlbackend.JSONValue{}, errors.New("TODO: not implemented")+	return graphqlbackend.JSONValue{Value: r.campaignSpec.Spec}, nil }  func (r *campaignSpecResolver) ChangesetSpecs() ([]graphqlbackend.ChangesetSpecResolver, error) { 	return []graphqlbackend.ChangesetSpecResolver{}, errors.New("TODO: not implemented") } -func (r *campaignSpecResolver) Creator() (*graphqlbackend.UserResolver, error) {-	return nil, errors.New("TODO: not implemented")+func (r *campaignSpecResolver) Creator(ctx context.Context) (*graphqlbackend.UserResolver, error) {+	return graphqlbackend.UserByIDInt32(ctx, r.campaignSpec.UserID) } -func (r *campaignSpecResolver) Namespace() (*graphqlbackend.NamespaceResolver, error) {-	return nil, errors.New("TODO: not implemented")+func (r *campaignSpecResolver) Namespace(ctx context.Context) (*graphqlbackend.NamespaceResolver, error) {+	var (+		err error+		n   = &graphqlbackend.NamespaceResolver{}+	)++	if r.campaignSpec.NamespaceUserID != 0 {+		n.Namespace, err = graphqlbackend.UserByIDInt32(ctx, r.campaignSpec.NamespaceUserID)+	} else {+		n.Namespace, err = graphqlbackend.OrgByIDInt32(ctx, r.campaignSpec.NamespaceOrgID)+	}++	if errcode.IsNotFound(err) {+		return nil, nil+	}++	return n, err }  func (r *campaignSpecResolver) PreviewURL() (string, error) {-	return "", errors.New("TODO: not implemented")+	// TODO: this needs to take the namespace into account+	return "/campaigns/new?spec=" + string(r.ID()), nil

Not blocking, hence the approval :) But I wanted to point it out, since it came to mind during the review.

mrnugget

comment created time in 9 days

Pull request review comment sourcegraph/sourcegraph

Add GraphQL resolvers for CampaignSpec/ChangesetSpec

 package resolvers  import (+	"context" 	"time"  	"github.com/graph-gophers/graphql-go"+	"github.com/graph-gophers/graphql-go/relay" 	"github.com/pkg/errors" 	"github.com/sourcegraph/sourcegraph/cmd/frontend/graphqlbackend"+	ee "github.com/sourcegraph/sourcegraph/enterprise/internal/campaigns"+	"github.com/sourcegraph/sourcegraph/internal/campaigns"+	"github.com/sourcegraph/sourcegraph/internal/errcode"+	"github.com/sourcegraph/sourcegraph/internal/httpcli" ) +func marshalCampaignSpecRandID(id string) graphql.ID {+	return relay.MarshalID("CampaignSpec", id)+}++func unmarshalCampaignSpecID(id graphql.ID) (campaignSpecRandID string, err error) {+	err = relay.UnmarshalSpec(id, &campaignSpecRandID)+	return+}+ var _ graphqlbackend.CampaignSpecResolver = &campaignSpecResolver{}  type campaignSpecResolver struct {+	store       *ee.Store+	httpFactory *httpcli.Factory++	campaignSpec *campaigns.CampaignSpec } -func (r *campaignSpecResolver) ID() (graphql.ID, error) {-	return "", errors.New("TODO: not implemented")+func (r *campaignSpecResolver) ID() graphql.ID {+	// 🚨 SECURITY: This needs to be the RandID! We can't expose the+	// sequential, guessable ID.+	return marshalCampaignSpecRandID(r.campaignSpec.RandID) }  func (r *campaignSpecResolver) OriginalInput() (string, error) {-	return "", errors.New("TODO: not implemented")+	return r.campaignSpec.RawSpec, nil }  func (r *campaignSpecResolver) ParsedInput() (graphqlbackend.JSONValue, error) {-	return graphqlbackend.JSONValue{}, errors.New("TODO: not implemented")+	return graphqlbackend.JSONValue{Value: r.campaignSpec.Spec}, nil }  func (r *campaignSpecResolver) ChangesetSpecs() ([]graphqlbackend.ChangesetSpecResolver, error) { 	return []graphqlbackend.ChangesetSpecResolver{}, errors.New("TODO: not implemented") } -func (r *campaignSpecResolver) Creator() (*graphqlbackend.UserResolver, error) {-	return nil, errors.New("TODO: not implemented")+func (r *campaignSpecResolver) Creator(ctx context.Context) (*graphqlbackend.UserResolver, error) {+	return graphqlbackend.UserByIDInt32(ctx, r.campaignSpec.UserID) } -func (r *campaignSpecResolver) Namespace() (*graphqlbackend.NamespaceResolver, error) {-	return nil, errors.New("TODO: not implemented")+func (r *campaignSpecResolver) Namespace(ctx context.Context) (*graphqlbackend.NamespaceResolver, error) {+	var (+		err error+		n   = &graphqlbackend.NamespaceResolver{}+	)++	if r.campaignSpec.NamespaceUserID != 0 {+		n.Namespace, err = graphqlbackend.UserByIDInt32(ctx, r.campaignSpec.NamespaceUserID)+	} else {+		n.Namespace, err = graphqlbackend.OrgByIDInt32(ctx, r.campaignSpec.NamespaceOrgID)+	}++	if errcode.IsNotFound(err) {+		return nil, nil+	}++	return n, err }  func (r *campaignSpecResolver) PreviewURL() (string, error) {-	return "", errors.New("TODO: not implemented")+	// TODO: this needs to take the namespace into account+	return "/campaigns/new?spec=" + string(r.ID()), nil

Can externalURL ever not be set? At least a fallback to a localhost address should always be set, no?

I know we are, but I think it would be more helpful: I imagine being able to jq the response and pipe the URL straight to some other command, without having to concatenate it with some locally configured externalURL value 🤔
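For what it's worth, a minimal Go sketch of building that absolute URL on the server from the configured externalURL (the function and variable names are made up for illustration):

package main

import (
	"fmt"
	"net/url"
)

// absolutePreviewURL joins the instance's external URL with the relative
// preview path, so API consumers get a link they can use directly.
func absolutePreviewURL(externalURL, previewPath string) (string, error) {
	base, err := url.Parse(externalURL)
	if err != nil {
		return "", err
	}
	rel, err := url.Parse(previewPath)
	if err != nil {
		return "", err
	}
	return base.ResolveReference(rel).String(), nil
}

func main() {
	u, _ := absolutePreviewURL("https://sourcegraph.example.com", "/campaigns/new?spec=SPEC_ID")
	fmt.Println(u) // https://sourcegraph.example.com/campaigns/new?spec=SPEC_ID
}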

mrnugget

comment created time in 9 days

Pull request review comment sourcegraph/sourcegraph

Add GraphQL resolvers for CampaignSpec/ChangesetSpec

 package resolvers  import (+	"context" 	"time"  	"github.com/graph-gophers/graphql-go"+	"github.com/graph-gophers/graphql-go/relay" 	"github.com/pkg/errors" 	"github.com/sourcegraph/sourcegraph/cmd/frontend/graphqlbackend"+	ee "github.com/sourcegraph/sourcegraph/enterprise/internal/campaigns"+	"github.com/sourcegraph/sourcegraph/internal/campaigns"+	"github.com/sourcegraph/sourcegraph/internal/errcode"+	"github.com/sourcegraph/sourcegraph/internal/httpcli" ) +func marshalCampaignSpecRandID(id string) graphql.ID {+	return relay.MarshalID("CampaignSpec", id)+}++func unmarshalCampaignSpecID(id graphql.ID) (campaignSpecRandID string, err error) {+	err = relay.UnmarshalSpec(id, &campaignSpecRandID)+	return+}+ var _ graphqlbackend.CampaignSpecResolver = &campaignSpecResolver{}  type campaignSpecResolver struct {+	store       *ee.Store+	httpFactory *httpcli.Factory++	campaignSpec *campaigns.CampaignSpec } -func (r *campaignSpecResolver) ID() (graphql.ID, error) {-	return "", errors.New("TODO: not implemented")+func (r *campaignSpecResolver) ID() graphql.ID {+	// 🚨 SECURITY: This needs to be the RandID! We can't expose the+	// sequential, guessable ID.+	return marshalCampaignSpecRandID(r.campaignSpec.RandID) }  func (r *campaignSpecResolver) OriginalInput() (string, error) {-	return "", errors.New("TODO: not implemented")+	return r.campaignSpec.RawSpec, nil }  func (r *campaignSpecResolver) ParsedInput() (graphqlbackend.JSONValue, error) {-	return graphqlbackend.JSONValue{}, errors.New("TODO: not implemented")+	return graphqlbackend.JSONValue{Value: r.campaignSpec.Spec}, nil }  func (r *campaignSpecResolver) ChangesetSpecs() ([]graphqlbackend.ChangesetSpecResolver, error) { 	return []graphqlbackend.ChangesetSpecResolver{}, errors.New("TODO: not implemented") } -func (r *campaignSpecResolver) Creator() (*graphqlbackend.UserResolver, error) {-	return nil, errors.New("TODO: not implemented")+func (r *campaignSpecResolver) Creator(ctx context.Context) (*graphqlbackend.UserResolver, error) {+	return graphqlbackend.UserByIDInt32(ctx, r.campaignSpec.UserID) } -func (r *campaignSpecResolver) Namespace() (*graphqlbackend.NamespaceResolver, error) {-	return nil, errors.New("TODO: not implemented")+func (r *campaignSpecResolver) Namespace(ctx context.Context) (*graphqlbackend.NamespaceResolver, error) {+	var (+		err error+		n   = &graphqlbackend.NamespaceResolver{}+	)++	if r.campaignSpec.NamespaceUserID != 0 {+		n.Namespace, err = graphqlbackend.UserByIDInt32(ctx, r.campaignSpec.NamespaceUserID)+	} else {+		n.Namespace, err = graphqlbackend.OrgByIDInt32(ctx, r.campaignSpec.NamespaceOrgID)+	}++	if errcode.IsNotFound(err) {+		return nil, nil+	}++	return n, err }  func (r *campaignSpecResolver) PreviewURL() (string, error) {-	return "", errors.New("TODO: not implemented")+	// TODO: this needs to take the namespace into account+	return "/campaigns/new?spec=" + string(r.ID()), nil

I think in the API we should return absolute URLs; it makes it easier for API consumers to link to the proper resource.

mrnugget

comment created time in 9 days

Pull request review comment sourcegraph/sourcegraph

search: test fork:true is an alias for fork:yes

 var tests = []test{
 		Name:  `Global search, repo search by name, case yes, nonzero result`,
 		Query: `repo:^github\.com/rvantonderp/adjust-go-wrk$ String case:yes count:1 stable:yes`,
 	},
+	{
+		Name:  `True is an alias for yes is when fork is set`,
		Name:  `True is an alias for yes when fork is set`,
rvantonder

comment created time in 9 days

pull request comment sourcegraph/sourcegraph

Implement support to preview unsupported patches

Ah, you're right, it's not shown in non-verbose mode. What do you think about the below?

image

eseliger

comment created time in 9 days

pull request comment sourcegraph/sourcegraph

Implement support to preview unsupported patches

So, the question is: instead of silently dropping this, what options do we have?

This is fixed IMO: src-cli lists the unsupported repos and suggests using the flag to still include them: image

And once they hit the backend API, we never discard them anymore; we keep them and display them as in the screenshot above, with publishable: false. I think the ticket mentioned us dropping them silently on the backend, which we no longer do.

eseliger

comment created time in 10 days

issue comment sourcegraph/sourcegraph

Use paginated search in src-cli to fetch repositories for campaigns

When there are more matches than that, it can become a problem. But it's pretty much already a problem at that point, because such a search query takes a very long time and needs a lot of memory 🤔 I tried some big queries locally, grabbing 1000 results per page, and each page takes about as long as the whole query does while still consuming around 11GB of memory. So as far as I understand that API, it can take up to 10,000,000/1,000 = 10,000 times as long while still consuming that much memory, so I'm not sure we get any quick wins here that apply to all queries. Once queries get that expensive, I think we should rather encourage users to simplify them for now, until we find a way where we maybe don't need to query the matches at all and can only query the repos that have matches(?) Pagination itself is fairly easy to implement, but I didn't get any promising results out of it yet. We could dig deeper here, but I'm not sure what wonders to expect, especially given that customers would need to learn to use the type:file filter where possible (and know when it isn't) 🤔

mrnugget

comment created time in 10 days

issue comment sourcegraph/sourcegraph

Campaigns: 3.18 Tracking Issue

Weekly update: 2020-06-22 to 2020-06-28

Last week

Besides a couple of interviews, I finished YAML support in src-cli, worked toward the frontend testing OKR by adding better serializers for Enzyme and migrating the campaigns codebase to use them in this PR, and cleaned up and tested edge cases around the removal of the manual campaigns distinction here. I also started working on a prototype that uses search pagination in src-cli, and wrote my writeup of that today.

This week

I am going to work on enabling the collection of changesets for unsupported code hosts, and will look into the new GraphQL API and UI to see what tasks we can derive from it, so I can make my placeholder issue #10986 more precise.

mrnugget

comment created time in 10 days

issue comment sourcegraph/sourcegraph

Use paginated search in src-cli to fetch repositories for campaigns

Does your verdict change when taking these two constraints into account?

Right now, the only case where the complete-list constraint isn't met is when there are more than the hard-coded 999999 results. In that case pagination would work, but it would basically take forever to page through all of them (unless we set the first value to something really high). And search timeouts still aren't ruled out, because each request still needs to compute much more than just 100 results.

Right now, or in general?

Right now. I'm not sure it would be impossible, but it isn't implemented at this time.

Why is it brittle?

First, it's not something we encourage customers to use in production. But more importantly, it is hard to choose the right value for first, and a bad value can cause a lot of duplicated computation across the page requests, which actually keeps the pressure on the backend for longer: https://docs.sourcegraph.com/api/graphql/search#choosing-the-right-per-page-value. So even if we find a value that works well for testing, it might be a very bad value for some other queries.
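To make the trade-off concrete, a purely illustrative Go sketch of the cursor/first loop being discussed; the page type and fetch function are invented here, and the real Sourcegraph GraphQL search API has its own request and response shape (see the linked docs):

package main

import "fmt"

type page struct {
	results     []string
	endCursor   string
	hasNextPage bool
}

// fetchPage stands in for one GraphQL request; a real client would send the
// search query plus first/after variables and decode the response.
func fetchPage(first int, after string) page {
	return page{results: []string{"repoA", "repoB"}, hasNextPage: false}
}

func main() {
	const first = 1000 // the hard part: a value that is good for one query can be terrible for another
	var all []string
	cursor := ""
	for {
		p := fetchPage(first, cursor)
		all = append(all, p.results...)
		if !p.hasNextPage {
			break
		}
		cursor = p.endCursor
	}
	fmt.Println(len(all)) // 2
}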

mrnugget

comment created time in 10 days

PR opened sourcegraph/sourcegraph

Implement support to preview unsupported patches campaigns

Closes #11193

+380 -32

0 comment

18 changed files

pr created time in 10 days

create branch sourcegraph/sourcegraph

branch : es/preview-unsupported

created branch time in 10 days

create branch sourcegraph/src-cli

branch : es/search-pagination

created branch time in 10 days
