
timbod7/haskell-chart 359

A 2D charting library for haskell

timbod7/adl 164

ADL (Algebraic Data Language)

timbod7/secretfs 18

A FUSE file system that interpolates secret information

timbod7/flux-model 5

An executable specification for the flux voting system

timbod7/adl-demo 2

Demonstration ADL projects

timbod7/letsencrypt-aws 2

Manager for letsencrypt's certbot when running on AWS

timbod7/hbeat 1

A toy step sequencer in haskell (old)

push event helix-collective/gizmo

Jaie Wilson

commit sha 2bfe2b5690908f00827e2f32007bd1e4048c8346

updated the sdk to allow a custom async and namespace

push time in a day

push event helix-collective/helix-adl-tools

Vyoma Patel

commit sha 593e01eeba4c756cb6cebe9482add5dd2096b5c8

automated removal of non-empty dev cache directory for linux

Vyoma Patel

commit sha 84a2a11f6e24fe3502ce08894e900ff20d036b6e

resolved primary key alter query gen

push time in 2 days

push event helix-collective/helix-adl-tools

Vyoma Patel

commit sha 979f59a4b9395a96b6682151a49ecadd87cb6c6a

changed command name

push time in 2 days

push event helix-collective/helix-adl-tools

Vyoma Patel

commit sha 48362704c9404f6232308291b449d51783ab21f5

refactored to generate up sql queries

push time in 2 days

push event helix-collective/helix-adl-tools

Vyoma Patel

commit sha a6e693de4b8c289f8f75eefe2b446b3a9a81fe2e

updated the command script

Vyoma Patel

commit sha 0faff82a6460062fdaac700219250a5ec75b95cf

Merge branch 'new-option-sql' of github.com:helix-collective/helix-adl-tools into new-option-sql

push time in 3 days

Pull request review comment helix-collective/hx-terraform

Proposal for Breaking changelog

+# Changelog of breaking changes and their remediation where applicable.
+
+## Format Template
+### {Change/PR/Commit title} {Date}
+[link to Change/PR/Commit]()

Can the template please omit the requirement to link to the PR and commit? When making a change, it's impossible to know which commit it will become, and impractical to know which PR it will become.

The actual PR and commit can be found by git blame anyhow.

ashvds

comment created time in 3 days

Pull request review comment helix-collective/hx-terraform

Proposal for Breaking changelog

+# Changelog of breaking changes and their remediation where applicable.
+
+## Format Template
+### {Change/PR/Commit title} {Date}
### One line description of the change.
ashvds

comment created time in 3 days

delete branch helix-collective/hx-terraform

delete branch : ashvds/changelog

delete time in 3 days

push event helix-collective/hx-terraform

Ash van der Spuy

commit sha 722c7eb92dd15d67cfa2ebcd9d6af081a9194436

Proposal for Breaking changelog (#5)

* Updated doco to detail where breaking change info can be found based on Helix agreed standards

push time in 3 days

PR merged helix-collective/hx-terraform

Proposal for Breaking changelog

After struggling with hxterraform over the weekend, it was suggested I update the project to latest. This introduced breaking changes, and I didn't know that I had to find said changes in a commit message and a Slack thread.

I believe having this documented alongside the code makes it easy to find without having to search through commits. I used something similar at a previous job and it worked well.

+8 -0

1 comment

1 changed file

ashvds

pr closed time in 3 days

push event helix-collective/hx-terraform

ashvds

commit sha 8d8cab0d4e338da2ba4b741123ea30dd23c58aad

Updated doco based on Helix agreed standards

push time in 3 days

pull request comment helix-collective/hx-terraform

Proposal for Breaking changelog

  • I didn't know I could find instructions on fixing breaking changes in Slack and commit messages, nor did I know which commit message to look for
ashvds

comment created time in 3 days

PR opened helix-collective/hx-terraform

Proposal for Breaking changelog

After struggling with hxterraform over the weekend, it was suggested I update the project to latest. This introduced breaking changes, and I didn't know that I had to find said changes in a commit message and a Slack thread.

I believe having this documented alongside the code makes it easy to find without having to search through commits. I used something similar at a previous job and it worked well.

+22 -0

0 comments

2 changed files

pr created time in 3 days

create branch helix-collective/hx-terraform

branch : ashvds/changelog

created branch time in 3 days

push event helix-collective/helix-adl-tools

Gary Miller

commit sha c1aa30f1f1107b70c9edeac8a63cb60840610053

migration as its own command

push time in 3 days

Pull request review comment helix-collective/helix-adl-tools

generate sql for migrations using new option for genadl

+import * as adlast from './adl-gen/sys/adlast';
+import * as adl from "./adl-gen/runtime/adl";
+import { createJsonBinding } from "./adl-gen/runtime/json";
+import { collect, scopedName, scopedNamesEqual, expandTypes, expandNewType, expandTypeAlias, parseAdl, forEachDecl, getAnnotation, decodeTypeExpr, LoadedAdl } from "./util";
+import * as fs from "fs";
+import * as mustache from "mustache";
+import { isEnum, typeExprToStringUnscoped } from './adl-gen/runtime/utils';
+import { Command } from "commander";
+import { snakeCase } from "change-case";
+
+export function configureCli(program: Command) {
+  program
+   .command("sql [adlFiles...]")
+   .option('-I, --searchdir <path>', 'Add to adl searchpath', collect, [])
+   .option('--outfile <path>', 'the resulting sql file', 'create.sql')
+   .option('--outputdir <dir>', 'the directory into which the sql is written (deprecated)')
+   .option('--outmetadata <path>', 'sql to insert the model metadata')
+   .option('--outtemplatesql <paths>', 'generate extra sql from a mustache template', collect, [])
+   .option('--postgres', 'Generate sql for postgres')
+   .option('--postgres-v2', 'Generate sql for postgres (model version 2)')
+   .option('--mssql', 'Generate sql for microsoft sqlserver')
+   .option('--extension <ext>', 'Add to included sql extensions', collect, [])
+   .option('--migration', 'Generate sql for migration')
+   .description('Generate a db schema from ADL files')
+   .action( (adlFiles:string[], cmd:{}) => {
+     const adlSearchPath: string[] = cmd['searchdir'];
+     const extensions: string[] = cmd['extension'];
+     const templates: Template[] = parseTemplates(cmd['outtemplatesql'] || []);
+
+     let outfile: string = cmd['outfile'];
+     if (cmd['outputdir']) {
+       outfile = cmd['outputdir'] + '/create.sql';
+     }
+
+     let outmetadata: string | null = cmd['outmetadata'] || null;
+
+     let dbProfile = postgresDbProfile;
+     if (cmd['postgresV2']) {
+       dbProfile = postgres2DbProfile;
+     }
+     if (cmd['mssql']) {
+       dbProfile = mssql2DbProfile;
+     }
+
+     generateSql({adlFiles, adlSearchPath, outfile, outmetadata, extensions, templates, dbProfile});
+   });
+}
+
+export interface Params {
+  adlFiles: string[];
+  adlSearchPath: string[];
+  outfile: string;
+  outmetadata: string | null;
+  extensions: string[];
+  templates: Template[];
+  dbProfile: DbProfile;
+};
+
+interface Template {
+  template: string;
+  outfile: string;
+};
+
+export interface DbTable {
+  scopedDecl: adlast.ScopedDecl;
+  struct: adlast.DeclType_Struct_;
+  ann: {}|null;
+  name: string;
+};
+
+export async function generateSql(params: Params): Promise<void> {
+  // Load the ADL based upon command line arguments
+  const loadedAdl = await parseAdl(params.adlFiles, params.adlSearchPath);
+
+  // Find all of the struct declarations that have a DbTable annotation
+  const dbTables: DbTable[]  = [];
+
+  forEachDecl(loadedAdl.modules, scopedDecl => {
+    if (scopedDecl.decl.type_.kind == 'struct_') {
+      const struct = scopedDecl.decl.type_;
+      const ann = getAnnotation(scopedDecl.decl.annotations, DB_TABLE);
+      if (ann != undefined) {
+        const name = getTableName(scopedDecl);
+        dbTables.push({scopedDecl, struct, ann, name});
+      }
+    }
+  });
+  dbTables.sort( (t1, t2) => t1.name < t2.name ? -1 : t1.name > t2.name ? 1 : 0);
+  await generateSqlSchema(params, loadedAdl, dbTables);
+  if (params.outmetadata !== null) {
+    await generateMetadata(params.outmetadata, params, loadedAdl, dbTables);
+  }
+  for(const t of params.templates) {
+    await generateTemplate(t, dbTables);
+  }
+}
+
+async function generateSqlSchema(params: Params, loadedAdl: LoadedAdl, dbTables: DbTable[]): Promise<void> {
+  // Now generate the SQL file
+  const writer = fs.createWriteStream(params.outfile);
+  const moduleNames : Set<string> = new Set(dbTables.map(dbt => dbt.scopedDecl.moduleName));
+  writer.write( `-- Schema auto-generated from adl modules: ${Array.from(moduleNames.keys()).join(', ')}\n` );
+  writer.write( `--\n` );
+
+  if (params.extensions.length > 0) {
+    writer.write('\n');
+    params.extensions.forEach( e => {
+      writer.write( `create extension ${e};\n` );
+    });
+  }
+
+  const constraints: string[] = [];
+  let allExtraSql: string[] = [];
+
+  // Output the tables
+  for(const t of dbTables) {
+    const withIdPrimaryKey: boolean  = t.ann && t.ann['withIdPrimaryKey'] || false;
+    const withPrimaryKey: string[] = t.ann && t.ann['withPrimaryKey'] || [];
+    const indexes: string[][] = t.ann && t.ann['indexes'] || [];
+    const uniquenessConstraints: string[][] = t.ann && t.ann['uniquenessConstraints'] || [];
+    const extraSql: string[] = t.ann && t.ann['extraSql'] || [];
+
+    const lines: {code:string, comment?:string}[] = [];
+    if (withIdPrimaryKey) {
+      lines.push({code: `id ${params.dbProfile.idColumnType} not null`});
+    }
+    for(const f of t.struct.value.fields) {
+      const columnName = getColumnName(f);
+      const columnType = getColumnType(loadedAdl.resolver, f, params.dbProfile);
+      lines.push({
+        code: `${columnName} ${columnType.sqltype}`,
+        comment: typeExprToStringUnscoped(f.typeExpr),
+      });
+      if (columnType.fkey) {
+        constraints.push(`alter table ${quoteReservedName(t.name)} add constraint ${t.name}_${columnName}_fk foreign key (${columnName}) references ${quoteReservedName(columnType.fkey.table)}(${columnType.fkey.column});`);
+      }
+    }
+
+    function findColName(s: string):string {
+      for(const f of t.struct.value.fields) {
+        if (f.name == s) {
+          return getColumnName(f);
+        }
+      }
+      return s;
+    }
+
+    for(let i = 0; i < indexes.length; i++) {
+      const cols = indexes[i].map(findColName);
+      constraints.push(`create index ${t.name}_${i+1}_idx on ${quoteReservedName(t.name)}(${cols.join(', ')});`);
+    }
+    for(let i = 0; i < uniquenessConstraints.length; i++) {
+      const cols = uniquenessConstraints[i].map(findColName);
+      constraints.push(`alter table ${quoteReservedName(t.name)} add constraint ${t.name}_${i+1}_con unique (${cols.join(', ')});`);
+    }
+    if (withIdPrimaryKey) {
+      lines.push({code:'primary key(id)'});
+    } else if (withPrimaryKey.length > 0) {
+      const cols = withPrimaryKey.map(findColName);
+      lines.push({code:`primary key(${cols.join(',')})`});
+    }
+
+
+    writer.write("begin;");

@gmhta how do I test/run this script?

vyyyy

comment created time in 6 days
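The `collect` helper imported from `./util` in the diff above is passed to commander's `.option` as the argument-processing function for repeatable flags like `-I` and `--extension`, but its body isn't shown. A minimal sketch of how such an accumulator typically works (the implementation below is an assumption inferred from its usage, not the actual `./util` code):

```typescript
// Hypothetical reconstruction of the `collect` accumulator from ./util:
// commander calls the option's coercion function as (newValue, previousValue),
// so returning a grown array lets a flag be given repeatedly on one command line.
function collect(value: string, previous: string[]): string[] {
  return previous.concat([value]);
}

// Simulate commander processing `sql a.adl -I dir1 -I dir2`
// against the declared default of []:
let searchdir: string[] = [];
for (const v of ["dir1", "dir2"]) {
  searchdir = collect(v, searchdir);
}
// searchdir is now ["dir1", "dir2"]
```

This is why the options declare `collect, []` as a pair: the third argument is the processor and the fourth is the starting value for the first call.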

Pull request review comment helix-collective/helix-adl-tools

generate sql for migrations using new option for genadl

+import * as adlast from './adl-gen/sys/adlast';
+import * as adl from "./adl-gen/runtime/adl";
+import { createJsonBinding } from "./adl-gen/runtime/json";
+import { collect, scopedName, scopedNamesEqual, expandTypes, expandNewType, expandTypeAlias, parseAdl, forEachDecl, getAnnotation, decodeTypeExpr, LoadedAdl } from "./util";
+import * as fs from "fs";
+import * as mustache from "mustache";
+import { isEnum, typeExprToStringUnscoped } from './adl-gen/runtime/utils';
+import { Command } from "commander";
+import { snakeCase } from "change-case";
+
+export function configureCli(program: Command) {
+  program
+   .command("sql [adlFiles...]")
+   .option('-I, --searchdir <path>', 'Add to adl searchpath', collect, [])
+   .option('--outfile <path>', 'the resulting sql file', 'create.sql')
+   .option('--outputdir <dir>', 'the directory into which the sql is written (deprecated)')
+   .option('--outmetadata <path>', 'sql to insert the model metadata')
+   .option('--outtemplatesql <paths>', 'generate extra sql from a mustache template', collect, [])
+   .option('--postgres', 'Generate sql for postgres')
+   .option('--postgres-v2', 'Generate sql for postgres (model version 2)')
+   .option('--mssql', 'Generate sql for microsoft sqlserver')
+   .option('--extension <ext>', 'Add to included sql extensions', collect, [])
+   .option('--migration', 'Generate sql for migration')

@gmhta using --migration

vyyyy

comment created time in 6 days
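The generator quoted in this review derives index and constraint names positionally: the i-th index on a table becomes `<table>_<i+1>_idx`, and the i-th uniqueness constraint becomes `<table>_<i+1>_con`. A small sketch of the index half of that scheme in isolation (the function name is illustrative, and quoting of reserved table names is omitted here):

```typescript
// Illustrative sketch of the positional index-naming scheme used by the
// quoted generator: each column group gets a statement named <table>_<i+1>_idx.
function indexStatements(table: string, indexes: string[][]): string[] {
  return indexes.map(
    (cols, i) =>
      `create index ${table}_${i + 1}_idx on ${table}(${cols.join(", ")});`
  );
}

const stmts = indexStatements("user_account", [["email"], ["org_id", "role"]]);
// stmts contains:
//   create index user_account_1_idx on user_account(email);
//   create index user_account_2_idx on user_account(org_id, role);
```

A consequence worth noting for the migration work discussed here: because names are positional, reordering the `indexes` annotation changes the generated names, which matters when diffing old and new schemas.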

push event helix-collective/helix-adl-tools

Vyoma Patel

commit sha 18db728d324d32520f7c5d348d6be428c24e426a

used alter table to add columns so sql queries run one by one

push time in 6 days

Pull request review comment helix-collective/helix-adl-tools

generate sql for migrations using new option for genadl

 export function configureCli(program: Command) {
    .option('--postgres-v2', 'Generate sql for postgres (model version 2)')
    .option('--mssql', 'Generate sql for microsoft sqlserver')
    .option('--extension <ext>', 'Add to included sql extensions', collect, [])
+   .option('--sqlmigration', 'Generate sql for migration')

@gmhta WIP right now; please suggest what the option should be named.

vyyyy

comment created time in 6 days

create branch helix-collective/helix-adl-tools

branch : new-option-sql

created branch time in 6 days

release helix-collective/helix-adl-tools

v0.38.2

released time in 8 days

started timbod7/haskell-chart

started time in 9 days

push event helix-collective/hx-terraform

paul-thompson-helix

commit sha c8aea762fe2f9259288ab0a73e7275d49711ee97

Update docs/ with instructions and documentation re dnit (#4)

push time in 10 days

delete branch helix-collective/hx-terraform

delete branch : dnit-post-apply-clear-plan

delete time in 10 days

push event helix-collective/hx-terraform

paul-thompson-helix

commit sha 8258a68abd01333cf90735a6dbfa55730d58cd08

fix: Ensure terraform plan is removed after apply (#3)

fix: Remove terraform plan after any error during apply.

push time in 10 days

create branch helix-collective/hx-terraform

branch : dnit-docs

created branch time in 11 days
