---
id: start
title: Getting Started
sidebar_label: Getting Started
---

Super Graph can generate your initial app for you. The generated app will have config files, database migrations and seed files, among other things like Docker-related files.

You can then add your database schema to the migrations, create some seed data using the seed script, and launch Super Graph. You're then good to go and can start working on your UI frontend in React, Vue or whatever you prefer.

```bash
# Download and install Super Graph. You will need Go 1.14 or above
go get github.com/dosco/super-graph
```

And then create and launch your new app:

```bash
# Create a new app and change to its directory
super-graph new blog

cd blog

# Set up the app database and seed it with fake data.
# Docker Compose will start a Postgres database for your app
docker-compose run blog_api ./super-graph db:setup

# Finally, launch Super Graph configured for your app
docker-compose up
```

Let's take a look at the files generated by Super Graph when you create a new app:

```bash
super-graph new blog

> created 'blog'
> created 'blog/Dockerfile'
> created 'blog/docker-compose.yml'
> created 'blog/config'
> created 'blog/config/dev.yml'
> created 'blog/config/prod.yml'
> created 'blog/config/seed.js'
> created 'blog/config/migrations'
> created 'blog/config/migrations/100_init.sql'
> app 'blog' initialized
```

:::note Docker
Docker Compose is a great way to run multiple services while developing on your desktop or laptop. In our case we need Postgres and Super Graph both running, and the `docker-compose.yml` is configured to do just that. The Super Graph service is named after your app, postfixed with `_api`. The Dockerfile can be used to build a container of your app for production deployment.
:::

Run Super Graph with Docker Compose:

```bash
docker-compose run blog_api ./super-graph help
```

### Config files

All the config files needed to configure Super Graph for your app are contained in this folder. For starters you have two: `dev.yml` and `prod.yml`. When the `GO_ENV` environment variable is set to `development`, `dev.yml` is used; when it's set to `production`, the prod one is used. Stage and Test are the other two environment options, but you can set `GO_ENV` to whatever you like (eg. `alpha-test`) and Super Graph will look for a YAML file with that name to load config from.

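To make that naming convention concrete, here is a sketch of the lookup in shell. This only illustrates how the file name follows `GO_ENV`; the actual config loading is done inside Super Graph.

```bash
# Super Graph picks its config file from the GO_ENV value,
# i.e. config/<GO_ENV>.yml (illustration of the naming convention only)
GO_ENV=alpha-test
echo "config/${GO_ENV}.yml"   # prints: config/alpha-test.yml
```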
### Seed.js

Having data flowing through your API makes building your frontend UI so much easier. When crafting, say, a user profile, wouldn't it be nice for the API to return a fake user with name, picture and all? This is why having the ability to seed your database is important. Seeding can also be used in production to set up some initial users, like the admins, or to add an initial set of products to an e-commerce store.

Super Graph makes this easy by allowing you to write your seeding script in plain old JavaScript. The file below, auto-generated for new apps, uses our built-in functions `fake` and `graphql` to generate fake data and use GraphQL mutations to insert it into the database.

```javascript
// Example script to seed database

var users = [];

for (i = 0; i < 10; i++) {
  var data = {
    full_name: fake.name(),
    email: fake.email(),
  };

  var res = graphql(" \
  mutation { \
    user(insert: $data) { \
      id \
    } \
  }", { data: data });

  users.push(res.user);
}
```

If you want to import a lot of data, using a CSV file is the best and fastest option. The `import_csv` command uses the `COPY FROM` Postgres method to load massive amounts of data into tables. The first line of the CSV file must be the header with column names.

```javascript
var post_count = import_csv("posts", "posts.csv");
```

You can generate fake data for your seeding purposes. Below is the list of fake data functions supported by the built-in fake data library. For example, `fake.image_url()` will generate a fake image URL, `fake.shuffle_strings(['hello', 'world', 'cool'])` will return a randomly shuffled version of that array of strings, and `fake.rand_string(['hello', 'world', 'cool'])` will return a random string from the array provided.

```
// Person
person
name
name_prefix
name_suffix
first_name
last_name
gender
ssn
contact
email
phone
phone_formatted
username
password

// Address
address
city
country
country_abr
state
state_abr
status_code
street
street_name
street_number
street_prefix
street_suffix
zip
latitude
latitude_in_range
longitude
longitude_in_range

// Beer
beer_alcohol
beer_hop
beer_ibu
beer_blg
beer_malt
beer_name
beer_style
beer_yeast

// Cars
car
car_type
car_maker
car_model

// Text
word
sentence
paragraph
question
quote

// Misc
generate
boolean
uuid

// Colors
color
hex_color
rgb_color
safe_color

// Internet
url
image_url
domain_name
domain_suffix
ipv4_address
ipv6_address
simple_status_code
http_method
user_agent
user_agent_firefox
user_agent_chrome
user_agent_opera
user_agent_safari

// Date / Time
date
date_range
nano_second
second
minute
hour
month
day
weekday
year
timezone
timezone_abv
timezone_full
timezone_offset

// Payment
price
credit_card
credit_card_cvv
credit_card_number
credit_card_number_luhn
credit_card_type
currency
currency_long
currency_short

// Company
bs
buzzword
company
company_suffix
job
job_description
job_level
job_title

// Hacker
hacker_abbreviation
hacker_adjective
hacker_ingverb
hacker_noun
hacker_phrase
hacker_verb

// Hipster
hipster_word
hipster_paragraph
hipster_sentence

// File
file_extension
file_mine_type

// Numbers
number
numerify
int8
int16
int32
int64
uint8
uint16
uint32
uint64
float32
float32_range
float64
float64_range
shuffle_ints
mac_address

// String
digit
letter
lexify
shuffle_strings
numerify
```

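To make the semantics of two of these helpers concrete, here is a plain-JavaScript sketch of what `shuffle_strings` and `rand_string` conceptually do. The real implementations are built into Super Graph's seed runtime; these are illustrations only, not the actual code.

```javascript
// Illustration only: plain-JS equivalents of two seed helpers.

// Fisher-Yates shuffle: returns a new, randomly ordered copy of the array
function shuffle_strings(arr) {
  var out = arr.slice();
  for (var i = out.length - 1; i > 0; i--) {
    var j = Math.floor(Math.random() * (i + 1));
    var tmp = out[i];
    out[i] = out[j];
    out[j] = tmp;
  }
  return out;
}

// Returns one random element from the array
function rand_string(arr) {
  return arr[Math.floor(Math.random() * arr.length)];
}

var words = ['hello', 'world', 'cool'];
console.log(shuffle_strings(words)); // e.g. ['cool', 'hello', 'world']
console.log(rand_string(words));     // e.g. 'world'
```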
Other utility functions:

```
shuffle_strings(string_array)
make_slug(text)
make_slug_lang(text, lang)
```

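As a rough illustration of what `make_slug` does, here is a plain-JavaScript approximation. This is an assumption about its behavior (turning text into a URL-friendly slug), not the built-in implementation, which may handle accents and language rules differently (that's what `make_slug_lang` is for).

```javascript
// Illustration only: a rough plain-JS approximation of `make_slug`.
// The built-in may differ in edge cases (accents, language rules, etc.).
function make_slug(text) {
  return text
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, '-') // runs of non-alphanumerics become '-'
    .replace(/^-+|-+$/g, '');    // trim leading/trailing '-'
}

console.log(make_slug('Hello, World!')); // hello-world
```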
### Migrations

Easy database migrations are the most important thing when building products backed by a relational database. We make it super easy to manage and migrate your database.

```bash
super-graph db:new create_users
> created migration 'config/migrations/101_create_users.sql'
```

Migrations in Super Graph are plain old Postgres SQL. Here's an example for the above migration:

```sql
-- Write your migrate up statements here

CREATE TABLE public.users (
  id bigint GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
  full_name text,
  email text UNIQUE NOT NULL CHECK (length(email) < 255),
  created_at timestamptz NOT NULL DEFAULT NOW(),
  updated_at timestamptz NOT NULL DEFAULT NOW()
);

---- create above / drop below ----

-- Write your down migrate statements here. If this migration is irreversible
-- then delete the separator line above.

DROP TABLE public.users;
```

We encourage you to leverage triggers to maintain the consistency of your data. For example, here are a couple of triggers that you can add to your init migration and use across your tables.

```sql
-- This trigger script will set the updated_at column every time a row is updated
CREATE OR REPLACE FUNCTION trigger_set_updated_at()
RETURNS TRIGGER SET SCHEMA 'public' LANGUAGE 'plpgsql' AS $$
BEGIN
  new.updated_at = now();
  RETURN new;
END;
$$;

...

-- An example of adding this trigger to the users table
CREATE TRIGGER set_updated_at BEFORE UPDATE ON public.users
  FOR EACH ROW EXECUTE PROCEDURE trigger_set_updated_at();
```

```sql
-- This trigger script will set the user_id column to the current
-- Super Graph user.id value every time a row is created or updated
CREATE OR REPLACE FUNCTION trigger_set_user_id()
RETURNS TRIGGER SET SCHEMA 'public' LANGUAGE 'plpgsql' AS $$
BEGIN
  IF TG_OP = 'UPDATE' THEN
    new.user_id = old.user_id;
  ELSE
    new.user_id = current_setting('user.id')::int;
  END IF;

  RETURN new;
END;
$$;

...

-- An example of adding this trigger to the blog_posts table
CREATE TRIGGER set_user_id BEFORE INSERT OR UPDATE ON public.blog_posts
  FOR EACH ROW EXECUTE PROCEDURE trigger_set_user_id();
```