Compare commits

..

10 Commits

Author SHA1 Message Date
3wc e1f2318273 Merge branch 'tests' into publicapi-tests 2021-07-17 13:37:43 +02:00
3wc 7b9e9debab Disable VM creation check for the moment 2021-07-17 13:37:28 +02:00
3wc bef379273f Fix botched merge 2021-07-17 13:32:54 +02:00
3wc 82c0c63ff4 Add tests for public API 2021-07-17 13:29:16 +02:00
3wc d5e0e069d1 Merge branch 'tests' into publicapi-tests 2021-07-17 13:28:50 +02:00
3wc b8d149e862 Updates for upstream IP handling 2021-07-17 13:23:43 +02:00
3wc cf42ac5e4d Add basic "create" API..
.. using server-side API tokens
2021-07-17 12:18:16 +02:00
3wc 9823bda4a2 Add SSH key tests 2021-07-17 12:10:28 +02:00
3wc 4c7caa1a38 Initial console tests
NB capsul create isn't working properly, see #83
2021-07-17 11:54:15 +02:00
3wc 665e087bd4 Basic testing using flask-testing
This commit makes it possible to override settings during tests, by
switching capsulflask/__init__.py to a "create_app" pattern, and using
`dotenv_values` instead of `load_dotenv`.

The create_app() method returns a Flask app instance, to give
more control over when to initialise the app. This allows setting
environment variables in test files.

Then, use dotenv_values to override loaded .env variables with ones from
the environment, so that tests can set `POSTGRES_CONNECTION_PARAMETERS`
and `SPOKE_MODEL` (possibly others in future..).

Initial tests for the "landing" pages, and login / activation, are
included.
2021-07-17 10:38:04 +02:00
69 changed files with 1127 additions and 1938 deletions
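The "create_app" pattern and `dotenv_values` override described in the last commit message above could look roughly like the following. This is a minimal sketch, not the exact `capsulflask/__init__.py` code; the config keys shown are just the ones that appear in the diff below.

```python
# Minimal sketch (assumption: simplified from the real capsulflask/__init__.py).
# .env values are loaded with dotenv_values() and then overridden by anything
# already set in the process environment, which is what lets test files set
# variables like POSTGRES_CONNECTION_PARAMETERS or SPOKE_MODEL before calling
# create_app().
import os

from dotenv import dotenv_values, find_dotenv
from flask import Flask


def create_app():
    config = {
        **dotenv_values(find_dotenv()),
        **os.environ,  # environment variables win over .env file values
    }

    app = Flask(__name__)
    app.config.from_mapping(
        TESTING=config.get("TESTING", "False").lower() in ['true', '1', 't', 'y', 'yes'],
        BASE_URL=config.get("BASE_URL", "http://localhost:5000"),
        SECRET_KEY=config.get("SECRET_KEY", "dev"),
    )
    return app
```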


@ -1,13 +0,0 @@
---
kind: pipeline
name: publish docker image
steps:
- name: build and publish
image: plugins/docker
settings:
username:
from_secret: docker_reg_username_3wc
password:
from_secret: docker_reg_passwd_3wc
repo: 3wordchant/capsul-flask
tags: ${DRONE_COMMIT_BRANCH}


@ -1,11 +0,0 @@
# Optional, default `mock`
#SPOKE_MODEL=shell-scripts
# Optional, default `0`
#FLASK_DEBUG=0
# Optional, default `http://localhost:5000`
#BASE_URL=http://localhost:5000
# Optional, default `qemu:///system` if you're root, otherwise `qemu:///session`
#VIRSH_DEFAULT_CONNECT_URI=qemu:///system
#ADMIN_PANEL_ALLOW_EMAIL_ADDRESSES=3wc.capsul@doesthisthing.work
# Optional, default no theme
#THEME=yolocolo

.gitignore

@ -1,6 +1,5 @@
notes.txt
.env
.env.bak
.vscode
*.pyc
@ -11,11 +10,9 @@ instance/
.pytest_cache/
.coverage
htmlcov/
/unittest-log-output.log
dist/
build/
*.egg-info/
.venv
unittest-log-output.log


@ -1,48 +0,0 @@
FROM python:3.8-alpine as build
RUN apk add --no-cache \
build-base \
gcc \
gettext \
git \
jpeg-dev \
libffi-dev \
libjpeg \
musl-dev \
postgresql-dev \
python3-dev \
zlib-dev
RUN mkdir -p /app/{code,venv}
WORKDIR /app/code
COPY Pipfile Pipfile.lock /app/code/
RUN python3 -m venv /app/venv
RUN pip install pipenv setuptools
ENV PATH="/app/venv/bin:$PATH" VIRTUAL_ENV="/app/venv"
RUN pip install wheel cppy
# Install dependencies into the virtual environment with Pipenv
RUN pipenv install --deploy --verbose
FROM python:3.8-alpine
RUN apk add --no-cache \
cloud-utils \
libjpeg \
libpq \
libstdc++ \
libvirt-client \
openssh-client \
virt-install
COPY . /app/code/
WORKDIR /app/code
COPY --from=build /app/venv /app/venv
ENV PATH="/app/venv/bin:$PATH" VIRTUAL_ENV="/app/venv"
CMD ["gunicorn", "--bind", "0.0.0.0:5000", "-k", "gevent", "--worker-connections", "1000", "app:app"]
VOLUME /app/code
EXPOSE 5000

Pipfile.lock (generated)

@ -1,7 +1,7 @@
{
"_meta": {
"hash": {
"sha256": "b1ac4a00fca97c0174ff7c0637fc1b9c0c2264b805bf51a2281016397e2a319e"
"sha256": "12572a60fdc1c3246935e0ce5c4fee31cf89e9dc01a03d0fab87ce2b0633fc89"
},
"pipfile-spec": 6,
"requires": {
@ -18,46 +18,46 @@
"default": {
"aiohttp": {
"hashes": [
"sha256:02f46fc0e3c5ac58b80d4d56eb0a7c7d97fcef69ace9326289fb9f1955e65cfe",
"sha256:0563c1b3826945eecd62186f3f5c7d31abb7391fedc893b7e2b26303b5a9f3fe",
"sha256:114b281e4d68302a324dd33abb04778e8557d88947875cbf4e842c2c01a030c5",
"sha256:14762875b22d0055f05d12abc7f7d61d5fd4fe4642ce1a249abdf8c700bf1fd8",
"sha256:15492a6368d985b76a2a5fdd2166cddfea5d24e69eefed4630cbaae5c81d89bd",
"sha256:17c073de315745a1510393a96e680d20af8e67e324f70b42accbd4cb3315c9fb",
"sha256:209b4a8ee987eccc91e2bd3ac36adee0e53a5970b8ac52c273f7f8fd4872c94c",
"sha256:230a8f7e24298dea47659251abc0fd8b3c4e38a664c59d4b89cca7f6c09c9e87",
"sha256:2e19413bf84934d651344783c9f5e22dee452e251cfd220ebadbed2d9931dbf0",
"sha256:393f389841e8f2dfc86f774ad22f00923fdee66d238af89b70ea314c4aefd290",
"sha256:3cf75f7cdc2397ed4442594b935a11ed5569961333d49b7539ea741be2cc79d5",
"sha256:3d78619672183be860b96ed96f533046ec97ca067fd46ac1f6a09cd9b7484287",
"sha256:40eced07f07a9e60e825554a31f923e8d3997cfc7fb31dbc1328c70826e04cde",
"sha256:493d3299ebe5f5a7c66b9819eacdcfbbaaf1a8e84911ddffcdc48888497afecf",
"sha256:4b302b45040890cea949ad092479e01ba25911a15e648429c7c5aae9650c67a8",
"sha256:515dfef7f869a0feb2afee66b957cc7bbe9ad0cdee45aec7fdc623f4ecd4fb16",
"sha256:547da6cacac20666422d4882cfcd51298d45f7ccb60a04ec27424d2f36ba3eaf",
"sha256:5df68496d19f849921f05f14f31bd6ef53ad4b00245da3195048c69934521809",
"sha256:64322071e046020e8797117b3658b9c2f80e3267daec409b350b6a7a05041213",
"sha256:7615dab56bb07bff74bc865307aeb89a8bfd9941d2ef9d817b9436da3a0ea54f",
"sha256:79ebfc238612123a713a457d92afb4096e2148be17df6c50fb9bf7a81c2f8013",
"sha256:7b18b97cf8ee5452fa5f4e3af95d01d84d86d32c5e2bfa260cf041749d66360b",
"sha256:932bb1ea39a54e9ea27fc9232163059a0b8855256f4052e776357ad9add6f1c9",
"sha256:a00bb73540af068ca7390e636c01cbc4f644961896fa9363154ff43fd37af2f5",
"sha256:a5ca29ee66f8343ed336816c553e82d6cade48a3ad702b9ffa6125d187e2dedb",
"sha256:af9aa9ef5ba1fd5b8c948bb11f44891968ab30356d65fd0cc6707d989cd521df",
"sha256:bb437315738aa441251214dad17428cafda9cdc9729499f1d6001748e1d432f4",
"sha256:bdb230b4943891321e06fc7def63c7aace16095be7d9cf3b1e01be2f10fba439",
"sha256:c6e9dcb4cb338d91a73f178d866d051efe7c62a7166653a91e7d9fb18274058f",
"sha256:cffe3ab27871bc3ea47df5d8f7013945712c46a3cc5a95b6bee15887f1675c22",
"sha256:d012ad7911653a906425d8473a1465caa9f8dea7fcf07b6d870397b774ea7c0f",
"sha256:d9e13b33afd39ddeb377eff2c1c4f00544e191e1d1dee5b6c51ddee8ea6f0cf5",
"sha256:e4b2b334e68b18ac9817d828ba44d8fcb391f6acb398bcc5062b14b2cbeac970",
"sha256:e54962802d4b8b18b6207d4a927032826af39395a3bd9196a5af43fc4e60b009",
"sha256:f705e12750171c0ab4ef2a3c76b9a4024a62c4103e3a55dd6f99265b9bc6fcfc",
"sha256:f881853d2643a29e643609da57b96d5f9c9b93f62429dcc1cbb413c7d07f0e1a",
"sha256:fe60131d21b31fd1a14bd43e6bb88256f69dfc3188b3a89d736d6c71ed43ec95"
"sha256:0b795072bb1bf87b8620120a6373a3c61bfcb8da7e5c2377f4bb23ff4f0b62c9",
"sha256:0d438c8ca703b1b714e82ed5b7a4412c82577040dadff479c08405e2a715564f",
"sha256:16a3cb5df5c56f696234ea9e65e227d1ebe9c18aa774d36ff42f532139066a5f",
"sha256:1edfd82a98c5161497bbb111b2b70c0813102ad7e0aa81cbeb34e64c93863005",
"sha256:2406dc1dda01c7f6060ab586e4601f18affb7a6b965c50a8c90ff07569cf782a",
"sha256:2858b2504c8697beb9357be01dc47ef86438cc1cb36ecb6991796d19475faa3e",
"sha256:2a7b7640167ab536c3cb90cfc3977c7094f1c5890d7eeede8b273c175c3910fd",
"sha256:3228b7a51e3ed533f5472f54f70fd0b0a64c48dc1649a0f0e809bec312934d7a",
"sha256:328b552513d4f95b0a2eea4c8573e112866107227661834652a8984766aa7656",
"sha256:39f4b0a6ae22a1c567cb0630c30dd082481f95c13ca528dc501a7766b9c718c0",
"sha256:3b0036c978cbcc4a4512278e98e3e6d9e6b834dc973206162eddf98b586ef1c6",
"sha256:3ea8c252d8df5e9166bcf3d9edced2af132f4ead8ac422eac723c5781063709a",
"sha256:41608c0acbe0899c852281978492f9ce2c6fbfaf60aff0cefc54a7c4516b822c",
"sha256:59d11674964b74a81b149d4ceaff2b674b3b0e4d0f10f0be1533e49c4a28408b",
"sha256:5e479df4b2d0f8f02133b7e4430098699450e1b2a826438af6bec9a400530957",
"sha256:684850fb1e3e55c9220aad007f8386d8e3e477c4ec9211ae54d968ecdca8c6f9",
"sha256:6ccc43d68b81c424e46192a778f97da94ee0630337c9bbe5b2ecc9b0c1c59001",
"sha256:6d42debaf55450643146fabe4b6817bb2a55b23698b0434107e892a43117285e",
"sha256:710376bf67d8ff4500a31d0c207b8941ff4fba5de6890a701d71680474fe2a60",
"sha256:756ae7efddd68d4ea7d89c636b703e14a0c686688d42f588b90778a3c2fc0564",
"sha256:77149002d9386fae303a4a162e6bce75cc2161347ad2ba06c2f0182561875d45",
"sha256:78e2f18a82b88cbc37d22365cf8d2b879a492faedb3f2975adb4ed8dfe994d3a",
"sha256:7d9b42127a6c0bdcc25c3dcf252bb3ddc70454fac593b1b6933ae091396deb13",
"sha256:8389d6044ee4e2037dca83e3f6994738550f6ee8cfb746762283fad9b932868f",
"sha256:9c1a81af067e72261c9cbe33ea792893e83bc6aa987bfbd6fdc1e5e7b22777c4",
"sha256:c1e0920909d916d3375c7a1fdb0b1c78e46170e8bb42792312b6eb6676b2f87f",
"sha256:c68fdf21c6f3573ae19c7ee65f9ff185649a060c9a06535e9c3a0ee0bbac9235",
"sha256:c733ef3bdcfe52a1a75564389bad4064352274036e7e234730526d155f04d914",
"sha256:c9c58b0b84055d8bc27b7df5a9d141df4ee6ff59821f922dd73155861282f6a3",
"sha256:d03abec50df423b026a5aa09656bd9d37f1e6a49271f123f31f9b8aed5dc3ea3",
"sha256:d2cfac21e31e841d60dc28c0ec7d4ec47a35c608cb8906435d47ef83ffb22150",
"sha256:dcc119db14757b0c7bce64042158307b9b1c76471e655751a61b57f5a0e4d78e",
"sha256:df3a7b258cc230a65245167a202dd07320a5af05f3d41da1488ba0fa05bc9347",
"sha256:df48a623c58180874d7407b4d9ec06a19b84ed47f60a3884345b1a5099c1818b",
"sha256:e1b95972a0ae3f248a899cdbac92ba2e01d731225f566569311043ce2226f5e7",
"sha256:f326b3c1bbfda5b9308252ee0dcb30b612ee92b0e105d4abec70335fab5b1245",
"sha256:f411cb22115cb15452d099fec0ee636b06cf81bfb40ed9c02d30c8dc2bc2e3d1"
],
"index": "pypi",
"version": "==3.7.4.post0"
"version": "==3.7.3"
},
"apscheduler": {
"hashes": [
@ -84,10 +84,10 @@
},
"attrs": {
"hashes": [
"sha256:149e90d6d8ac20db7a955ad60cf0e6881a3f20d37096140088356da6c716b0b1",
"sha256:ef6aaac3ca6cd92904cdd0d83f629a15f18053ec84e6432106f7a4d04ae4f5fb"
"sha256:31b2eced602aa8423c2aea9c76a724617ed67cf9513173fd3a4f03e3a929c7e6",
"sha256:832aa3cde19744e49938b91fea06d69ecb9e649c93ba974535d08ad92164f700"
],
"version": "==21.2.0"
"version": "==20.3.0"
},
"blinker": {
"hashes": [
@ -98,25 +98,17 @@
},
"certifi": {
"hashes": [
"sha256:2bbf76fd432960138b3ef6dda3dde0544f27cbf8546c458e60baf371917ba9ee",
"sha256:50b1e4f8446b06f41be7dd6338db18e0990601dce795c2b1686458aa7e8fa7d8"
"sha256:1a4995114262bffbc2413b159f2a1a480c969de6e6eb13ee966d470af86af59c",
"sha256:719a74fb9e33b9bd44cc7f3a8d94bc35e4049deebe19ba7d8e108280cfd59830"
],
"version": "==2021.5.30"
"version": "==2020.12.5"
},
"chardet": {
"hashes": [
"sha256:0d6f53a15db4120f2b08c94f11e7d93d2c911ee118b6b30a04ec3ee8310179fa",
"sha256:f864054d66fd9118f2e67044ac8981a54775ec5b67aed0441892edb553d21da5"
"sha256:84ab92ed1c4d4f16916e05906b6b75a6c0fb5db821cc65e70cbd64a3e2a5eaae",
"sha256:fc323ffcaeaed0e0a02bf4d117757b98aed530d9ed4531e3e15460124c106691"
],
"version": "==4.0.0"
},
"charset-normalizer": {
"hashes": [
"sha256:88fce3fa5b1a84fdcb3f603d889f723d1dd89b26059d0123ca435570e848d5e1",
"sha256:c46c3ace2d744cfbdebceaa3c19ae691f53ae621b39fd7570f59d14fb7f2fd12"
],
"markers": "python_version >= '3'",
"version": "==2.0.3"
"version": "==3.0.4"
},
"click": {
"hashes": [
@ -135,11 +127,11 @@
},
"ecdsa": {
"hashes": [
"sha256:5cf31d5b33743abe0dfc28999036c849a69d548f994b535e527ee3cb7f3ef676",
"sha256:b9f500bb439e4153d0330610f5d26baaf18d17b8ced1bc54410d189385ea68aa"
"sha256:881fa5e12bb992972d3d1b3d4dfbe149ab76a89f13da02daa5ea1ec7dea6e747",
"sha256:cfc046a2ddd425adbd1a78b3c46f0d1325c657811c0f45ecc3a0a6236c1e50ff"
],
"index": "pypi",
"version": "==0.17.0"
"version": "==0.16.1"
},
"flask": {
"hashes": [
@ -156,13 +148,6 @@
"index": "pypi",
"version": "==0.9.1"
},
"flask-testing": {
"hashes": [
"sha256:0a734d7b68e63a9410b413cd7b1f96456f9a858bd09a6222d465650cc782eb01"
],
"index": "pypi",
"version": "==0.8.1"
},
"gevent": {
"hashes": [
"sha256:16574e4aa902ebc7bad564e25aa9740a82620fdeb61e0bbf5cbc32e84c13cb6a",
@ -199,58 +184,52 @@
},
"greenlet": {
"hashes": [
"sha256:03f28a5ea20201e70ab70518d151116ce939b412961c33827519ce620957d44c",
"sha256:06d7ac89e6094a0a8f8dc46aa61898e9e1aec79b0f8b47b2400dd51a44dbc832",
"sha256:06ecb43b04480e6bafc45cb1b4b67c785e183ce12c079473359e04a709333b08",
"sha256:096cb0217d1505826ba3d723e8981096f2622cde1eb91af9ed89a17c10aa1f3e",
"sha256:0c557c809eeee215b87e8a7cbfb2d783fb5598a78342c29ade561440abae7d22",
"sha256:0de64d419b1cb1bfd4ea544bedea4b535ef3ae1e150b0f2609da14bbf48a4a5f",
"sha256:14927b15c953f8f2d2a8dffa224aa78d7759ef95284d4c39e1745cf36e8cdd2c",
"sha256:16183fa53bc1a037c38d75fdc59d6208181fa28024a12a7f64bb0884434c91ea",
"sha256:206295d270f702bc27dbdbd7651e8ebe42d319139e0d90217b2074309a200da8",
"sha256:22002259e5b7828b05600a762579fa2f8b33373ad95a0ee57b4d6109d0e589ad",
"sha256:2325123ff3a8ecc10ca76f062445efef13b6cf5a23389e2df3c02a4a527b89bc",
"sha256:258f9612aba0d06785143ee1cbf2d7361801c95489c0bd10c69d163ec5254a16",
"sha256:3096286a6072553b5dbd5efbefc22297e9d06a05ac14ba017233fedaed7584a8",
"sha256:3d13da093d44dee7535b91049e44dd2b5540c2a0e15df168404d3dd2626e0ec5",
"sha256:408071b64e52192869129a205e5b463abda36eff0cebb19d6e63369440e4dc99",
"sha256:598bcfd841e0b1d88e32e6a5ea48348a2c726461b05ff057c1b8692be9443c6e",
"sha256:5d928e2e3c3906e0a29b43dc26d9b3d6e36921eee276786c4e7ad9ff5665c78a",
"sha256:5f75e7f237428755d00e7460239a2482fa7e3970db56c8935bd60da3f0733e56",
"sha256:60848099b76467ef09b62b0f4512e7e6f0a2c977357a036de602b653667f5f4c",
"sha256:6b1d08f2e7f2048d77343279c4d4faa7aef168b3e36039cba1917fffb781a8ed",
"sha256:70bd1bb271e9429e2793902dfd194b653221904a07cbf207c3139e2672d17959",
"sha256:76ed710b4e953fc31c663b079d317c18f40235ba2e3d55f70ff80794f7b57922",
"sha256:7920e3eccd26b7f4c661b746002f5ec5f0928076bd738d38d894bb359ce51927",
"sha256:7db68f15486d412b8e2cfcd584bf3b3a000911d25779d081cbbae76d71bd1a7e",
"sha256:8833e27949ea32d27f7e96930fa29404dd4f2feb13cce483daf52e8842ec246a",
"sha256:944fbdd540712d5377a8795c840a97ff71e7f3221d3fddc98769a15a87b36131",
"sha256:9a6b035aa2c5fcf3dbbf0e3a8a5bc75286fc2d4e6f9cfa738788b433ec894919",
"sha256:9bdcff4b9051fb1aa4bba4fceff6a5f770c6be436408efd99b76fc827f2a9319",
"sha256:a9017ff5fc2522e45562882ff481128631bf35da444775bc2776ac5c61d8bcae",
"sha256:aa4230234d02e6f32f189fd40b59d5a968fe77e80f59c9c933384fe8ba535535",
"sha256:ad80bb338cf9f8129c049837a42a43451fc7c8b57ad56f8e6d32e7697b115505",
"sha256:adb94a28225005890d4cf73648b5131e885c7b4b17bc762779f061844aabcc11",
"sha256:b3090631fecdf7e983d183d0fad7ea72cfb12fa9212461a9b708ff7907ffff47",
"sha256:b33b51ab057f8a20b497ffafdb1e79256db0c03ef4f5e3d52e7497200e11f821",
"sha256:b97c9a144bbeec7039cca44df117efcbeed7209543f5695201cacf05ba3b5857",
"sha256:be13a18cec649ebaab835dff269e914679ef329204704869f2f167b2c163a9da",
"sha256:be9768e56f92d1d7cd94185bab5856f3c5589a50d221c166cc2ad5eb134bd1dc",
"sha256:c1580087ab493c6b43e66f2bdd165d9e3c1e86ef83f6c2c44a29f2869d2c5bd5",
"sha256:c35872b2916ab5a240d52a94314c963476c989814ba9b519bc842e5b61b464bb",
"sha256:c70c7dd733a4c56838d1f1781e769081a25fade879510c5b5f0df76956abfa05",
"sha256:c767458511a59f6f597bfb0032a1c82a52c29ae228c2c0a6865cfeaeaac4c5f5",
"sha256:c87df8ae3f01ffb4483c796fe1b15232ce2b219f0b18126948616224d3f658ee",
"sha256:ca1c4a569232c063615f9e70ff9a1e2fee8c66a6fb5caf0f5e8b21a396deec3e",
"sha256:cc407b68e0a874e7ece60f6639df46309376882152345508be94da608cc0b831",
"sha256:da862b8f7de577bc421323714f63276acb2f759ab8c5e33335509f0b89e06b8f",
"sha256:dfe7eac0d253915116ed0cd160a15a88981a1d194c1ef151e862a5c7d2f853d3",
"sha256:ed1377feed808c9c1139bdb6a61bcbf030c236dd288d6fca71ac26906ab03ba6",
"sha256:f42ad188466d946f1b3afc0a9e1a266ac8926461ee0786c06baac6bd71f8a6f3",
"sha256:f92731609d6625e1cc26ff5757db4d32b6b810d2a3363b0ff94ff573e5901f6f"
"sha256:0a77691f0080c9da8dfc81e23f4e3cffa5accf0f5b56478951016d7cfead9196",
"sha256:0ddd77586553e3daf439aa88b6642c5f252f7ef79a39271c25b1d4bf1b7cbb85",
"sha256:111cfd92d78f2af0bc7317452bd93a477128af6327332ebf3c2be7df99566683",
"sha256:122c63ba795fdba4fc19c744df6277d9cfd913ed53d1a286f12189a0265316dd",
"sha256:181300f826625b7fd1182205b830642926f52bd8cdb08b34574c9d5b2b1813f7",
"sha256:1a1ada42a1fd2607d232ae11a7b3195735edaa49ea787a6d9e6a53afaf6f3476",
"sha256:1bb80c71de788b36cefb0c3bb6bfab306ba75073dbde2829c858dc3ad70f867c",
"sha256:1d1d4473ecb1c1d31ce8fd8d91e4da1b1f64d425c1dc965edc4ed2a63cfa67b2",
"sha256:292e801fcb3a0b3a12d8c603c7cf340659ea27fd73c98683e75800d9fd8f704c",
"sha256:2c65320774a8cd5fdb6e117c13afa91c4707548282464a18cf80243cf976b3e6",
"sha256:4365eccd68e72564c776418c53ce3c5af402bc526fe0653722bc89efd85bf12d",
"sha256:5352c15c1d91d22902582e891f27728d8dac3bd5e0ee565b6a9f575355e6d92f",
"sha256:58ca0f078d1c135ecf1879d50711f925ee238fe773dfe44e206d7d126f5bc664",
"sha256:5d4030b04061fdf4cbc446008e238e44936d77a04b2b32f804688ad64197953c",
"sha256:5d69bbd9547d3bc49f8a545db7a0bd69f407badd2ff0f6e1a163680b5841d2b0",
"sha256:5f297cb343114b33a13755032ecf7109b07b9a0020e841d1c3cedff6602cc139",
"sha256:62afad6e5fd70f34d773ffcbb7c22657e1d46d7fd7c95a43361de979f0a45aef",
"sha256:647ba1df86d025f5a34043451d7c4a9f05f240bee06277a524daad11f997d1e7",
"sha256:719e169c79255816cdcf6dccd9ed2d089a72a9f6c42273aae12d55e8d35bdcf8",
"sha256:7cd5a237f241f2764324396e06298b5dee0df580cf06ef4ada0ff9bff851286c",
"sha256:875d4c60a6299f55df1c3bb870ebe6dcb7db28c165ab9ea6cdc5d5af36bb33ce",
"sha256:90b6a25841488cf2cb1c8623a53e6879573010a669455046df5f029d93db51b7",
"sha256:94620ed996a7632723a424bccb84b07e7b861ab7bb06a5aeb041c111dd723d36",
"sha256:b5f1b333015d53d4b381745f5de842f19fe59728b65f0fbb662dafbe2018c3a5",
"sha256:c5b22b31c947ad8b6964d4ed66776bcae986f73669ba50620162ba7c832a6b6a",
"sha256:c93d1a71c3fe222308939b2e516c07f35a849c5047f0197442a4d6fbcb4128ee",
"sha256:cdb90267650c1edb54459cdb51dab865f6c6594c3a47ebd441bc493360c7af70",
"sha256:cfd06e0f0cc8db2a854137bd79154b61ecd940dce96fad0cba23fe31de0b793c",
"sha256:d3789c1c394944084b5e57c192889985a9f23bd985f6d15728c745d380318128",
"sha256:da7d09ad0f24270b20f77d56934e196e982af0d0a2446120cb772be4e060e1a2",
"sha256:df3e83323268594fa9755480a442cabfe8d82b21aba815a71acf1bb6c1776218",
"sha256:df8053867c831b2643b2c489fe1d62049a98566b1646b194cc815f13e27b90df",
"sha256:e1128e022d8dce375362e063754e129750323b67454cac5600008aad9f54139e",
"sha256:e6e9fdaf6c90d02b95e6b0709aeb1aba5affbbb9ccaea5502f8638e4323206be",
"sha256:eac8803c9ad1817ce3d8d15d1bb82c2da3feda6bee1153eec5c58fa6e5d3f770",
"sha256:eb333b90036358a0e2c57373f72e7648d7207b76ef0bd00a4f7daad1f79f5203",
"sha256:ed1d1351f05e795a527abc04a0d82e9aecd3bdf9f46662c36ff47b0b00ecaf06",
"sha256:f3dc68272990849132d6698f7dc6df2ab62a88b0d36e54702a8fd16c0490e44f",
"sha256:f59eded163d9752fd49978e0bab7a1ff21b1b8d25c05f0995d140cc08ac83379",
"sha256:f5e2d36c86c7b03c94b8459c3bd2c9fe2c7dab4b258b8885617d44a22e453fb7",
"sha256:f6f65bf54215e4ebf6b01e4bb94c49180a589573df643735107056f7a910275b",
"sha256:f8450d5ef759dbe59f84f2c9f77491bb3d3c44bc1a573746daf086e70b14c243",
"sha256:f97d83049715fd9dec7911860ecf0e17b48d8725de01e45de07d8ac0bd5bc378"
],
"markers": "platform_python_implementation == 'CPython'",
"version": "==1.1.0"
"version": "==1.0.0"
},
"gunicorn": {
"hashes": [
@ -262,11 +241,10 @@
},
"idna": {
"hashes": [
"sha256:14475042e284991034cb48e06f6851428fb14c4dc953acd9be9a5e95c7b6dd7a",
"sha256:467fbad99067910785144ce333826c71fb0e63a425657295239737f7ecd125f3"
"sha256:b307872f855b18632ce0c21c5e45be78c0ea7ae4c15c828c20788b26921eb3f6",
"sha256:b97d804b1e9b523befed77c48dacec60e6dcb0b5391d57af6a65a312a90648c0"
],
"markers": "python_version >= '3'",
"version": "==3.2"
"version": "==2.10"
},
"isort": {
"hashes": [
@ -416,28 +394,34 @@
},
"matplotlib": {
"hashes": [
"sha256:0bea5ec5c28d49020e5d7923c2725b837e60bc8be99d3164af410eb4b4c827da",
"sha256:1c1779f7ab7d8bdb7d4c605e6ffaa0614b3e80f1e3c8ccf7b9269a22dbc5986b",
"sha256:21b31057bbc5e75b08e70a43cefc4c0b2c2f1b1a850f4a0f7af044eb4163086c",
"sha256:32fa638cc10886885d1ca3d409d4473d6a22f7ceecd11322150961a70fab66dd",
"sha256:3a5c18dbd2c7c366da26a4ad1462fe3e03a577b39e3b503bbcf482b9cdac093c",
"sha256:5826f56055b9b1c80fef82e326097e34dc4af8c7249226b7dd63095a686177d1",
"sha256:6382bc6e2d7e481bcd977eb131c31dee96e0fb4f9177d15ec6fb976d3b9ace1a",
"sha256:6475d0209024a77f869163ec3657c47fed35d9b6ed8bccba8aa0f0099fbbdaa8",
"sha256:6a6a44f27aabe720ec4fd485061e8a35784c2b9ffa6363ad546316dfc9cea04e",
"sha256:7a58f3d8fe8fac3be522c79d921c9b86e090a59637cb88e3bc51298d7a2c862a",
"sha256:7ad19f3fb6145b9eb41c08e7cbb9f8e10b91291396bee21e9ce761bb78df63ec",
"sha256:85f191bb03cb1a7b04b5c2cca4792bef94df06ef473bc49e2818105671766fee",
"sha256:956c8849b134b4a343598305a3ca1bdd3094f01f5efc8afccdebeffe6b315247",
"sha256:a9d8cb5329df13e0cdaa14b3b43f47b5e593ec637f13f14db75bb16e46178b05",
"sha256:b1d5a2cedf5de05567c441b3a8c2651fbde56df08b82640e7f06c8cd91e201f6",
"sha256:b26535b9de85326e6958cdef720ecd10bcf74a3f4371bf9a7e5b2e659c17e153",
"sha256:c541ee5a3287efe066bbe358320853cf4916bc14c00c38f8f3d8d75275a405a9",
"sha256:d8d994cefdff9aaba45166eb3de4f5211adb4accac85cbf97137e98f26ea0219",
"sha256:df815378a754a7edd4559f8c51fc7064f779a74013644a7f5ac7a0c31f875866"
"sha256:1de0bb6cbfe460725f0e97b88daa8643bcf9571c18ba90bb8e41432aaeca91d6",
"sha256:1e850163579a8936eede29fad41e202b25923a0a8d5ffd08ce50fc0a97dcdc93",
"sha256:215e2a30a2090221a9481db58b770ce56b8ef46f13224ae33afe221b14b24dc1",
"sha256:348e6032f666ffd151b323342f9278b16b95d4a75dfacae84a11d2829a7816ae",
"sha256:3d2eb9c1cc254d0ffa90bc96fde4b6005d09c2228f99dfd493a4219c1af99644",
"sha256:3e477db76c22929e4c6876c44f88d790aacdf3c3f8f3a90cb1975c0bf37825b0",
"sha256:451cc89cb33d6652c509fc6b588dc51c41d7246afdcc29b8624e256b7663ed1f",
"sha256:46b1a60a04e6d884f0250d5cc8dc7bd21a9a96c584a7acdaab44698a44710bab",
"sha256:5f571b92a536206f7958f7cb2d367ff6c9a1fa8229dc35020006e4cdd1ca0acd",
"sha256:672960dd114e342b7c610bf32fb99d14227f29919894388b41553217457ba7ef",
"sha256:7310e353a4a35477c7f032409966920197d7df3e757c7624fd842f3eeb307d3d",
"sha256:746a1df55749629e26af7f977ea426817ca9370ad1569436608dc48d1069b87c",
"sha256:7c155437ae4fd366e2700e2716564d1787700687443de46bcb895fe0f84b761d",
"sha256:9265ae0fb35e29f9b8cc86c2ab0a2e3dcddc4dd9de4b85bf26c0f63fe5c1c2ca",
"sha256:94bdd1d55c20e764d8aea9d471d2ae7a7b2c84445e0fa463f02e20f9730783e1",
"sha256:9a79e5dd7bb797aa611048f5b70588b23c5be05b63eefd8a0d152ac77c4243db",
"sha256:a17f0a10604fac7627ec82820439e7db611722e80c408a726cd00d8c974c2fb3",
"sha256:a1acb72f095f1d58ecc2538ed1b8bca0b57df313b13db36ed34b8cdf1868e674",
"sha256:aa49571d8030ad0b9ac39708ee77bd2a22f87815e12bdee52ecaffece9313ed8",
"sha256:c24c05f645aef776e8b8931cb81e0f1632d229b42b6d216e30836e2e145a2b40",
"sha256:cf3a7e54eff792f0815dbbe9b85df2f13d739289c93d346925554f71d484be78",
"sha256:d738acfdfb65da34c91acbdb56abed46803db39af259b7f194dc96920360dbe4",
"sha256:e15fa23d844d54e7b3b7243afd53b7567ee71c721f592deb0727ee85e668f96a",
"sha256:ed4a9e6dcacba56b17a0a9ac22ae2c72a35b7f0ef0693aa68574f0b2df607a89",
"sha256:f44149a0ef5b4991aaef12a93b8e8d66d6412e762745fea1faa61d98524e0ba9"
],
"index": "pypi",
"version": "==3.4.2"
"version": "==3.3.4"
},
"mccabe": {
"hashes": [
@ -499,80 +483,69 @@
},
"numpy": {
"hashes": [
"sha256:01721eefe70544d548425a07c80be8377096a54118070b8a62476866d5208e33",
"sha256:0318c465786c1f63ac05d7c4dbcecd4d2d7e13f0959b01b534ea1e92202235c5",
"sha256:05a0f648eb28bae4bcb204e6fd14603de2908de982e761a2fc78efe0f19e96e1",
"sha256:1412aa0aec3e00bc23fbb8664d76552b4efde98fb71f60737c83efbac24112f1",
"sha256:25b40b98ebdd272bc3020935427a4530b7d60dfbe1ab9381a39147834e985eac",
"sha256:2d4d1de6e6fb3d28781c73fbde702ac97f03d79e4ffd6598b880b2d95d62ead4",
"sha256:38e8648f9449a549a7dfe8d8755a5979b45b3538520d1e735637ef28e8c2dc50",
"sha256:4a3d5fb89bfe21be2ef47c0614b9c9c707b7362386c9a3ff1feae63e0267ccb6",
"sha256:635e6bd31c9fb3d475c8f44a089569070d10a9ef18ed13738b03049280281267",
"sha256:73101b2a1fef16602696d133db402a7e7586654682244344b8329cdcbbb82172",
"sha256:791492091744b0fe390a6ce85cc1bf5149968ac7d5f0477288f78c89b385d9af",
"sha256:7a708a79c9a9d26904d1cca8d383bf869edf6f8e7650d85dbc77b041e8c5a0f8",
"sha256:88c0b89ad1cc24a5efbb99ff9ab5db0f9a86e9cc50240177a571fbe9c2860ac2",
"sha256:8a326af80e86d0e9ce92bcc1e65c8ff88297de4fa14ee936cb2293d414c9ec63",
"sha256:8a92c5aea763d14ba9d6475803fc7904bda7decc2a0a68153f587ad82941fec1",
"sha256:91c6f5fc58df1e0a3cc0c3a717bb3308ff850abdaa6d2d802573ee2b11f674a8",
"sha256:95b995d0c413f5d0428b3f880e8fe1660ff9396dcd1f9eedbc311f37b5652e16",
"sha256:9749a40a5b22333467f02fe11edc98f022133ee1bfa8ab99bda5e5437b831214",
"sha256:978010b68e17150db8765355d1ccdd450f9fc916824e8c4e35ee620590e234cd",
"sha256:9a513bd9c1551894ee3d31369f9b07460ef223694098cf27d399513415855b68",
"sha256:a75b4498b1e93d8b700282dc8e655b8bd559c0904b3910b144646dbbbc03e062",
"sha256:c6a2324085dd52f96498419ba95b5777e40b6bcbc20088fddb9e8cbb58885e8e",
"sha256:d7a4aeac3b94af92a9373d6e77b37691b86411f9745190d2c351f410ab3a791f",
"sha256:d9e7912a56108aba9b31df688a4c4f5cb0d9d3787386b87d504762b6754fbb1b",
"sha256:dff4af63638afcc57a3dfb9e4b26d434a7a602d225b42d746ea7fe2edf1342fd",
"sha256:e46ceaff65609b5399163de5893d8f2a82d3c77d5e56d976c8b5fb01faa6b671",
"sha256:f01f28075a92eede918b965e86e8f0ba7b7797a95aa8d35e1cc8821f5fc3ad6a",
"sha256:fd7d7409fa643a91d0a05c7554dd68aa9c9bb16e186f6ccfe40d6e003156e33a"
"sha256:032be656d89bbf786d743fee11d01ef318b0781281241997558fa7950028dd29",
"sha256:104f5e90b143dbf298361a99ac1af4cf59131218a045ebf4ee5990b83cff5fab",
"sha256:125a0e10ddd99a874fd357bfa1b636cd58deb78ba4a30b5ddb09f645c3512e04",
"sha256:12e4ba5c6420917571f1a5becc9338abbde71dd811ce40b37ba62dec7b39af6d",
"sha256:13adf545732bb23a796914fe5f891a12bd74cf3d2986eed7b7eba2941eea1590",
"sha256:2d7e27442599104ee08f4faed56bb87c55f8b10a5494ac2ead5c98a4b289e61f",
"sha256:3bc63486a870294683980d76ec1e3efc786295ae00128f9ea38e2c6e74d5a60a",
"sha256:3d3087e24e354c18fb35c454026af3ed8997cfd4997765266897c68d724e4845",
"sha256:4ed8e96dc146e12c1c5cdd6fb9fd0757f2ba66048bf94c5126b7efebd12d0090",
"sha256:60759ab15c94dd0e1ed88241fd4fa3312db4e91d2c8f5a2d4cf3863fad83d65b",
"sha256:65410c7f4398a0047eea5cca9b74009ea61178efd78d1be9847fac1d6716ec1e",
"sha256:66b467adfcf628f66ea4ac6430ded0614f5cc06ba530d09571ea404789064adc",
"sha256:7199109fa46277be503393be9250b983f325880766f847885607d9b13848f257",
"sha256:72251e43ac426ff98ea802a931922c79b8d7596480300eb9f1b1e45e0543571e",
"sha256:89e5336f2bec0c726ac7e7cdae181b325a9c0ee24e604704ed830d241c5e47ff",
"sha256:89f937b13b8dd17b0099c7c2e22066883c86ca1575a975f754babc8fbf8d69a9",
"sha256:9c94cab5054bad82a70b2e77741271790304651d584e2cdfe2041488e753863b",
"sha256:9eb551d122fadca7774b97db8a112b77231dcccda8e91a5bc99e79890797175e",
"sha256:a1d7995d1023335e67fb070b2fae6f5968f5be3802b15ad6d79d81ecaa014fe0",
"sha256:ae61f02b84a0211abb56462a3b6cd1e7ec39d466d3160eb4e1da8bf6717cdbeb",
"sha256:b9410c0b6fed4a22554f072a86c361e417f0258838957b78bd063bde2c7f841f",
"sha256:c26287dfc888cf1e65181f39ea75e11f42ffc4f4529e5bd19add57ad458996e2",
"sha256:c91ec9569facd4757ade0888371eced2ecf49e7982ce5634cc2cf4e7331a4b14",
"sha256:ecb5b74c702358cdc21268ff4c37f7466357871f53a30e6f84c686952bef16a9"
],
"version": "==1.21.1"
"version": "==1.20.1"
},
"pillow": {
"hashes": [
"sha256:0b2efa07f69dc395d95bb9ef3299f4ca29bcb2157dc615bae0b42c3c20668ffc",
"sha256:114f816e4f73f9ec06997b2fde81a92cbf0777c9e8f462005550eed6bae57e63",
"sha256:147bd9e71fb9dcf08357b4d530b5167941e222a6fd21f869c7911bac40b9994d",
"sha256:15a2808e269a1cf2131930183dcc0419bc77bb73eb54285dde2706ac9939fa8e",
"sha256:196560dba4da7a72c5e7085fccc5938ab4075fd37fe8b5468869724109812edd",
"sha256:1c03e24be975e2afe70dfc5da6f187eea0b49a68bb2b69db0f30a61b7031cee4",
"sha256:1fd5066cd343b5db88c048d971994e56b296868766e461b82fa4e22498f34d77",
"sha256:29c9569049d04aaacd690573a0398dbd8e0bf0255684fee512b413c2142ab723",
"sha256:2b6dfa068a8b6137da34a4936f5a816aba0ecc967af2feeb32c4393ddd671cba",
"sha256:2cac53839bfc5cece8fdbe7f084d5e3ee61e1303cccc86511d351adcb9e2c792",
"sha256:2ee77c14a0299d0541d26f3d8500bb57e081233e3fa915fa35abd02c51fa7fae",
"sha256:37730f6e68bdc6a3f02d2079c34c532330d206429f3cee651aab6b66839a9f0e",
"sha256:3f08bd8d785204149b5b33e3b5f0ebbfe2190ea58d1a051c578e29e39bfd2367",
"sha256:479ab11cbd69612acefa8286481f65c5dece2002ffaa4f9db62682379ca3bb77",
"sha256:4bc3c7ef940eeb200ca65bd83005eb3aae8083d47e8fcbf5f0943baa50726856",
"sha256:660a87085925c61a0dcc80efb967512ac34dbb256ff7dd2b9b4ee8dbdab58cf4",
"sha256:67b3666b544b953a2777cb3f5a922e991be73ab32635666ee72e05876b8a92de",
"sha256:70af7d222df0ff81a2da601fab42decb009dc721545ed78549cb96e3a1c5f0c8",
"sha256:75e09042a3b39e0ea61ce37e941221313d51a9c26b8e54e12b3ececccb71718a",
"sha256:8960a8a9f4598974e4c2aeb1bff9bdd5db03ee65fd1fce8adf3223721aa2a636",
"sha256:9364c81b252d8348e9cc0cb63e856b8f7c1b340caba6ee7a7a65c968312f7dab",
"sha256:969cc558cca859cadf24f890fc009e1bce7d7d0386ba7c0478641a60199adf79",
"sha256:9a211b663cf2314edbdb4cf897beeb5c9ee3810d1d53f0e423f06d6ebbf9cd5d",
"sha256:a17ca41f45cf78c2216ebfab03add7cc350c305c38ff34ef4eef66b7d76c5229",
"sha256:a2f381932dca2cf775811a008aa3027671ace723b7a38838045b1aee8669fdcf",
"sha256:a4eef1ff2d62676deabf076f963eda4da34b51bc0517c70239fafed1d5b51500",
"sha256:c088a000dfdd88c184cc7271bfac8c5b82d9efa8637cd2b68183771e3cf56f04",
"sha256:c0e0550a404c69aab1e04ae89cca3e2a042b56ab043f7f729d984bf73ed2a093",
"sha256:c11003197f908878164f0e6da15fce22373ac3fc320cda8c9d16e6bba105b844",
"sha256:c2a5ff58751670292b406b9f06e07ed1446a4b13ffced6b6cab75b857485cbc8",
"sha256:c35d09db702f4185ba22bb33ef1751ad49c266534339a5cebeb5159d364f6f82",
"sha256:c379425c2707078dfb6bfad2430728831d399dc95a7deeb92015eb4c92345eaf",
"sha256:cc866706d56bd3a7dbf8bac8660c6f6462f2f2b8a49add2ba617bc0c54473d83",
"sha256:d0da39795049a9afcaadec532e7b669b5ebbb2a9134576ebcc15dd5bdae33cc0",
"sha256:f156d6ecfc747ee111c167f8faf5f4953761b5e66e91a4e6767e548d0f80129c",
"sha256:f4ebde71785f8bceb39dcd1e7f06bcc5d5c3cf48b9f69ab52636309387b097c8",
"sha256:fc214a6b75d2e0ea7745488da7da3c381f41790812988c7a92345978414fad37",
"sha256:fd7eef578f5b2200d066db1b50c4aa66410786201669fb76d5238b007918fb24",
"sha256:ff04c373477723430dce2e9d024c708a047d44cf17166bf16e604b379bf0ca14"
"sha256:165c88bc9d8dba670110c689e3cc5c71dbe4bfb984ffa7cbebf1fac9554071d6",
"sha256:1d208e670abfeb41b6143537a681299ef86e92d2a3dac299d3cd6830d5c7bded",
"sha256:22d070ca2e60c99929ef274cfced04294d2368193e935c5d6febfd8b601bf865",
"sha256:2353834b2c49b95e1313fb34edf18fca4d57446675d05298bb694bca4b194174",
"sha256:39725acf2d2e9c17356e6835dccebe7a697db55f25a09207e38b835d5e1bc032",
"sha256:3de6b2ee4f78c6b3d89d184ade5d8fa68af0848f9b6b6da2b9ab7943ec46971a",
"sha256:47c0d93ee9c8b181f353dbead6530b26980fe4f5485aa18be8f1fd3c3cbc685e",
"sha256:5e2fe3bb2363b862671eba632537cd3a823847db4d98be95690b7e382f3d6378",
"sha256:604815c55fd92e735f9738f65dabf4edc3e79f88541c221d292faec1904a4b17",
"sha256:6c5275bd82711cd3dcd0af8ce0bb99113ae8911fc2952805f1d012de7d600a4c",
"sha256:731ca5aabe9085160cf68b2dbef95fc1991015bc0a3a6ea46a371ab88f3d0913",
"sha256:7612520e5e1a371d77e1d1ca3a3ee6227eef00d0a9cddb4ef7ecb0b7396eddf7",
"sha256:7916cbc94f1c6b1301ac04510d0881b9e9feb20ae34094d3615a8a7c3db0dcc0",
"sha256:81c3fa9a75d9f1afafdb916d5995633f319db09bd773cb56b8e39f1e98d90820",
"sha256:887668e792b7edbfb1d3c9d8b5d8c859269a0f0eba4dda562adb95500f60dbba",
"sha256:93a473b53cc6e0b3ce6bf51b1b95b7b1e7e6084be3a07e40f79b42e83503fbf2",
"sha256:96d4dc103d1a0fa6d47c6c55a47de5f5dafd5ef0114fa10c85a1fd8e0216284b",
"sha256:a3d3e086474ef12ef13d42e5f9b7bbf09d39cf6bd4940f982263d6954b13f6a9",
"sha256:b02a0b9f332086657852b1f7cb380f6a42403a6d9c42a4c34a561aa4530d5234",
"sha256:b09e10ec453de97f9a23a5aa5e30b334195e8d2ddd1ce76cc32e52ba63c8b31d",
"sha256:b6f00ad5ebe846cc91763b1d0c6d30a8042e02b2316e27b05de04fa6ec831ec5",
"sha256:bba80df38cfc17f490ec651c73bb37cd896bc2400cfba27d078c2135223c1206",
"sha256:c3d911614b008e8a576b8e5303e3db29224b455d3d66d1b2848ba6ca83f9ece9",
"sha256:ca20739e303254287138234485579b28cb0d524401f83d5129b5ff9d606cb0a8",
"sha256:cb192176b477d49b0a327b2a5a4979552b7a58cd42037034316b8018ac3ebb59",
"sha256:cdbbe7dff4a677fb555a54f9bc0450f2a21a93c5ba2b44e09e54fcb72d2bd13d",
"sha256:cf6e33d92b1526190a1de904df21663c46a456758c0424e4f947ae9aa6088bf7",
"sha256:d355502dce85ade85a2511b40b4c61a128902f246504f7de29bbeec1ae27933a",
"sha256:d673c4990acd016229a5c1c4ee8a9e6d8f481b27ade5fc3d95938697fa443ce0",
"sha256:dc577f4cfdda354db3ae37a572428a90ffdbe4e51eda7849bf442fb803f09c9b",
"sha256:dd9eef866c70d2cbbea1ae58134eaffda0d4bfea403025f4db6859724b18ab3d",
"sha256:f50e7a98b0453f39000619d845be8b06e611e56ee6e8186f7f60c3b1e2f0feae"
],
"version": "==8.3.1"
"version": "==8.1.0"
},
"psycopg2": {
"hashes": [
@ -610,18 +583,18 @@
},
"python-dateutil": {
"hashes": [
"sha256:0123cacc1627ae19ddf3c27a5de5bd67ee4586fbdd6440d9748f8abb483d3e86",
"sha256:961d03dc3453ebbc59dbdea9e4e11c5651520a876d0f4db161e8674aae935da9"
"sha256:73ebfe9dbf22e832286dafa60473e4cd239f8592f699aa5adaf10050e6e1823c",
"sha256:75bb3f31ea686f1197762692a9ee6a7550b59fc6ca3a1f4b5d7e32fb98e2da2a"
],
"version": "==2.8.2"
"version": "==2.8.1"
},
"python-dotenv": {
"hashes": [
"sha256:dd8fe852847f4fbfadabf6183ddd4c824a9651f02d51714fa075c95561959c7d",
"sha256:effaac3c1e58d89b3ccb4d04a40dc7ad6e0275fda25fd75ae9d323e2465e202d"
"sha256:0c8d1b80d1a1e91717ea7d526178e3882732420b03f08afea0406db6402e220e",
"sha256:587825ed60b1711daea4832cf37524dfd404325b7db5e25ebe88c495c9f807a0"
],
"index": "pypi",
"version": "==0.18.0"
"version": "==0.15.0"
},
"pytz": {
"hashes": [
@ -632,12 +605,12 @@
},
"requests": {
"hashes": [
"sha256:6c1246513ecd5ecd4528a0906f910e8f0f9c6b8ec72030dc9fd154dc1a6efd24",
"sha256:b8aa58f8cf793ffd8782d3d8cb19e66ef36f7aba4353eec859e74678b01b07a7"
"sha256:27973dd4a904a4f13b263a19c866c13b92a39ed1c964655f025f3f8d3d75b804",
"sha256:c210084e36a42ae6b9219e00e48287def368a26d03a048ddad7bfee44f75871e"
],
"index": "pypi",
"markers": "python_version >= '3.0'",
"version": "==2.26.0"
"version": "==2.25.1"
},
"six": {
"hashes": [
@ -649,11 +622,11 @@
},
"stripe": {
"hashes": [
"sha256:0050763cb67df6745973bd9757f7a765bed1b82b5d5261fb8908cfc6ec9e5200",
"sha256:8966b7793014380f60c6f121ba333d6f333a55818edaf79c8d70464ce0a7a808"
"sha256:97431dbc6d25b94816816a3606f75045d79f101db4cf27e79ac4e039e4971d73",
"sha256:e32c68194a47522a10945eb893218e5cb5ee65e3a3c2c4df7efca117a6bf1902"
],
"index": "pypi",
"version": "==2.60.0"
"version": "==2.55.2"
},
"toml": {
"hashes": [
@ -702,11 +675,11 @@
},
"typing-extensions": {
"hashes": [
"sha256:0ac0f89795dd19de6b97debb0c6af1c70987fd80a2d62d1958f7e56fcc31b497",
"sha256:50b6f157849174217d0656f99dc82fe932884fb250826c18350e159ec6cdf342",
"sha256:779383f6086d90c99ae41cf0ff39aac8a7937a9283ce0a414e5dd782f4c94a84"
"sha256:7cb407020f00f7bfc3cb3e7881628838e69d8f3fcab2f64742a5e76b2f841918",
"sha256:99d4073b617d30288f569d3f13d2bd7548c3a7e4c8de87db09a9d29bb3a4a60c",
"sha256:dafc7639cde7f1b6e1acc0f457842a83e722ccca8eef5270af2d74792619a89f"
],
"version": "==3.10.0.0"
"version": "==3.7.4.3"
},
"tzlocal": {
"hashes": [
@ -717,10 +690,10 @@
},
"urllib3": {
"hashes": [
"sha256:39fb8672126159acb139a7718dd10806104dec1e2f0f6c88aab05d17df10c8d4",
"sha256:f57b4c16c62fa2760b7e3d97c35b255512fb6b59a259730f36ba32ce9f8e342f"
"sha256:1b465e494e3e0d8939b50680403e3aedaa2bc434b7d5af64dfd3c958d7f5ae80",
"sha256:de3eedaad74a2683334e282005cd8d7f22f4d55fa690a2a1020a416cb0a47e73"
],
"version": "==1.26.6"
"version": "==1.26.3"
},
"werkzeug": {
"hashes": [
@ -788,59 +761,60 @@
},
"zope.interface": {
"hashes": [
"sha256:08f9636e99a9d5410181ba0729e0408d3d8748026ea938f3b970a0249daa8192",
"sha256:0b465ae0962d49c68aa9733ba92a001b2a0933c317780435f00be7ecb959c702",
"sha256:0cba8477e300d64a11a9789ed40ee8932b59f9ee05f85276dbb4b59acee5dd09",
"sha256:0cee5187b60ed26d56eb2960136288ce91bcf61e2a9405660d271d1f122a69a4",
"sha256:0ea1d73b7c9dcbc5080bb8aaffb776f1c68e807767069b9ccdd06f27a161914a",
"sha256:0f91b5b948686659a8e28b728ff5e74b1be6bf40cb04704453617e5f1e945ef3",
"sha256:15e7d1f7a6ee16572e21e3576d2012b2778cbacf75eb4b7400be37455f5ca8bf",
"sha256:17776ecd3a1fdd2b2cd5373e5ef8b307162f581c693575ec62e7c5399d80794c",
"sha256:194d0bcb1374ac3e1e023961610dc8f2c78a0f5f634d0c737691e215569e640d",
"sha256:1c0e316c9add0db48a5b703833881351444398b04111188069a26a61cfb4df78",
"sha256:205e40ccde0f37496904572035deea747390a8b7dc65146d30b96e2dd1359a83",
"sha256:273f158fabc5ea33cbc936da0ab3d4ba80ede5351babc4f577d768e057651531",
"sha256:2876246527c91e101184f63ccd1d716ec9c46519cc5f3d5375a3351c46467c46",
"sha256:2c98384b254b37ce50eddd55db8d381a5c53b4c10ee66e1e7fe749824f894021",
"sha256:2e5a26f16503be6c826abca904e45f1a44ff275fdb7e9d1b75c10671c26f8b94",
"sha256:334701327f37c47fa628fc8b8d28c7d7730ce7daaf4bda1efb741679c2b087fc",
"sha256:3748fac0d0f6a304e674955ab1365d515993b3a0a865e16a11ec9d86fb307f63",
"sha256:3c02411a3b62668200910090a0dff17c0b25aaa36145082a5a6adf08fa281e54",
"sha256:3dd4952748521205697bc2802e4afac5ed4b02909bb799ba1fe239f77fd4e117",
"sha256:3f24df7124c323fceb53ff6168da70dbfbae1442b4f3da439cd441681f54fe25",
"sha256:469e2407e0fe9880ac690a3666f03eb4c3c444411a5a5fddfdabc5d184a79f05",
"sha256:4de4bc9b6d35c5af65b454d3e9bc98c50eb3960d5a3762c9438df57427134b8e",
"sha256:5208ebd5152e040640518a77827bdfcc73773a15a33d6644015b763b9c9febc1",
"sha256:52de7fc6c21b419078008f697fd4103dbc763288b1406b4562554bd47514c004",
"sha256:5bb3489b4558e49ad2c5118137cfeaf59434f9737fa9c5deefc72d22c23822e2",
"sha256:5dba5f530fec3f0988d83b78cc591b58c0b6eb8431a85edd1569a0539a8a5a0e",
"sha256:5dd9ca406499444f4c8299f803d4a14edf7890ecc595c8b1c7115c2342cadc5f",
"sha256:5f931a1c21dfa7a9c573ec1f50a31135ccce84e32507c54e1ea404894c5eb96f",
"sha256:63b82bb63de7c821428d513607e84c6d97d58afd1fe2eb645030bdc185440120",
"sha256:66c0061c91b3b9cf542131148ef7ecbecb2690d48d1612ec386de9d36766058f",
"sha256:6f0c02cbb9691b7c91d5009108f975f8ffeab5dff8f26d62e21c493060eff2a1",
"sha256:71aace0c42d53abe6fc7f726c5d3b60d90f3c5c055a447950ad6ea9cec2e37d9",
"sha256:7d97a4306898b05404a0dcdc32d9709b7d8832c0c542b861d9a826301719794e",
"sha256:7df1e1c05304f26faa49fa752a8c690126cf98b40b91d54e6e9cc3b7d6ffe8b7",
"sha256:8270252effc60b9642b423189a2fe90eb6b59e87cbee54549db3f5562ff8d1b8",
"sha256:867a5ad16892bf20e6c4ea2aab1971f45645ff3102ad29bd84c86027fa99997b",
"sha256:877473e675fdcc113c138813a5dd440da0769a2d81f4d86614e5d62b69497155",
"sha256:8892f89999ffd992208754851e5a052f6b5db70a1e3f7d54b17c5211e37a98c7",
"sha256:9a9845c4c6bb56e508651f005c4aeb0404e518c6f000d5a1123ab077ab769f5c",
"sha256:a1e6e96217a0f72e2b8629e271e1b280c6fa3fe6e59fa8f6701bec14e3354325",
"sha256:a8156e6a7f5e2a0ff0c5b21d6bcb45145efece1909efcbbbf48c56f8da68221d",
"sha256:a9506a7e80bcf6eacfff7f804c0ad5350c8c95b9010e4356a4b36f5322f09abb",
"sha256:af310ec8335016b5e52cae60cda4a4f2a60a788cbb949a4fbea13d441aa5a09e",
"sha256:b0297b1e05fd128d26cc2460c810d42e205d16d76799526dfa8c8ccd50e74959",
"sha256:bf68f4b2b6683e52bec69273562df15af352e5ed25d1b6641e7efddc5951d1a7",
"sha256:d0c1bc2fa9a7285719e5678584f6b92572a5b639d0e471bb8d4b650a1a910920",
"sha256:d4d9d6c1a455d4babd320203b918ccc7fcbefe308615c521062bc2ba1aa4d26e",
"sha256:db1fa631737dab9fa0b37f3979d8d2631e348c3b4e8325d6873c2541d0ae5a48",
"sha256:dd93ea5c0c7f3e25335ab7d22a507b1dc43976e1345508f845efc573d3d779d8",
"sha256:f44e517131a98f7a76696a7b21b164bcb85291cee106a23beccce454e1f433a4",
"sha256:f7ee479e96f7ee350db1cf24afa5685a5899e2b34992fb99e1f7c1b0b758d263"
"sha256:05a97ba92c1c7c26f25c9f671aa1ef85ffead6cdad13770e5b689cf983adc7e1",
"sha256:07d61722dd7d85547b7c6b0f5486b4338001fab349f2ac5cabc0b7182eb3425d",
"sha256:0a990dcc97806e5980bbb54b2e46b9cde9e48932d8e6984daf71ef1745516123",
"sha256:150e8bcb7253a34a4535aeea3de36c0bb3b1a6a47a183a95d65a194b3e07f232",
"sha256:1743bcfe45af8846b775086471c28258f4c6e9ee8ef37484de4495f15a98b549",
"sha256:1b5f6c8fff4ed32aa2dd43e84061bc8346f32d3ba6ad6e58f088fe109608f102",
"sha256:21e49123f375703cf824214939d39df0af62c47d122d955b2a8d9153ea08cfd5",
"sha256:21f579134a47083ffb5ddd1307f0405c91aa8b61ad4be6fd5af0171474fe0c45",
"sha256:27c267dc38a0f0079e96a2945ee65786d38ef111e413c702fbaaacbab6361d00",
"sha256:299bde0ab9e5c4a92f01a152b7fbabb460f31343f1416f9b7b983167ab1e33bc",
"sha256:2ab88d8f228f803fcb8cb7d222c579d13dab2d3622c51e8cf321280da01102a7",
"sha256:2ced4c35061eea623bc84c7711eedce8ecc3c2c51cd9c6afa6290df3bae9e104",
"sha256:2dcab01c660983ba5e5a612e0c935141ccbee67d2e2e14b833e01c2354bd8034",
"sha256:32546af61a9a9b141ca38d971aa6eb9800450fa6620ce6323cc30eec447861f3",
"sha256:32b40a4c46d199827d79c86bb8cb88b1bbb764f127876f2cb6f3a47f63dbada3",
"sha256:3cc94c69f6bd48ed86e8e24f358cb75095c8129827df1298518ab860115269a4",
"sha256:42b278ac0989d6f5cf58d7e0828ea6b5951464e3cf2ff229dd09a96cb6ba0c86",
"sha256:495b63fd0302f282ee6c1e6ea0f1c12cb3d1a49c8292d27287f01845ff252a96",
"sha256:4af87cdc0d4b14e600e6d3d09793dce3b7171348a094ba818e2a68ae7ee67546",
"sha256:4b94df9f2fdde7b9314321bab8448e6ad5a23b80542dcab53e329527d4099dcb",
"sha256:4c48ddb63e2b20fba4c6a2bf81b4d49e99b6d4587fb67a6cd33a2c1f003af3e3",
"sha256:4df9afd17bd5477e9f8c8b6bb8507e18dd0f8b4efe73bb99729ff203279e9e3b",
"sha256:518950fe6a5d56f94ba125107895f938a4f34f704c658986eae8255edb41163b",
"sha256:538298e4e113ccb8b41658d5a4b605bebe75e46a30ceca22a5a289cf02c80bec",
"sha256:55465121e72e208a7b69b53de791402affe6165083b2ea71b892728bd19ba9ae",
"sha256:588384d70a0f19b47409cfdb10e0c27c20e4293b74fc891df3d8eb47782b8b3e",
"sha256:6278c080d4afffc9016e14325f8734456831124e8c12caa754fd544435c08386",
"sha256:64ea6c221aeee4796860405e1aedec63424cda4202a7ad27a5066876db5b0fd2",
"sha256:681dbb33e2b40262b33fd383bae63c36d33fd79fa1a8e4092945430744ffd34a",
"sha256:6936aa9da390402d646a32a6a38d5409c2d2afb2950f045a7d02ab25a4e7d08d",
"sha256:778d0ec38bbd288b150a3ae363c8ffd88d2207a756842495e9bffd8a8afbc89a",
"sha256:8251f06a77985a2729a8bdbefbae79ee78567dddc3acbd499b87e705ca59fe24",
"sha256:83b4aa5344cce005a9cff5d0321b2e318e871cc1dfc793b66c32dd4f59e9770d",
"sha256:844fad925ac5c2ad4faaceb3b2520ad016b5280105c6e16e79838cf951903a7b",
"sha256:8ceb3667dd13b8133f2e4d637b5b00f240f066448e2aa89a41f4c2d78a26ce50",
"sha256:92dc0fb79675882d0b6138be4bf0cec7ea7c7eede60aaca78303d8e8dbdaa523",
"sha256:9789bd945e9f5bd026ed3f5b453d640befb8b1fc33a779c1fe8d3eb21fe3fb4a",
"sha256:a2b6d6eb693bc2fc6c484f2e5d93bd0b0da803fa77bf974f160533e555e4d095",
"sha256:aab9f1e34d810feb00bf841993552b8fcc6ae71d473c505381627143d0018a6a",
"sha256:abb61afd84f23099ac6099d804cdba9bd3b902aaaded3ffff47e490b0a495520",
"sha256:adf9ee115ae8ff8b6da4b854b4152f253b390ba64407a22d75456fe07dcbda65",
"sha256:aedc6c672b351afe6dfe17ff83ee5e7eb6ed44718f879a9328a68bdb20b57e11",
"sha256:b7a00ecb1434f8183395fac5366a21ee73d14900082ca37cf74993cf46baa56c",
"sha256:ba32f4a91c1cb7314c429b03afbf87b1fff4fb1c8db32260e7310104bd77f0c7",
"sha256:cbd0f2cbd8689861209cd89141371d3a22a11613304d1f0736492590aa0ab332",
"sha256:e4bc372b953bf6cec65a8d48482ba574f6e051621d157cf224227dbb55486b1e",
"sha256:eccac3d9aadc68e994b6d228cb0c8919fc47a5350d85a1b4d3d81d1e98baf40c",
"sha256:efd550b3da28195746bb43bd1d815058181a7ca6d9d6aa89dd37f5eefe2cacb7",
"sha256:efef581c8ba4d990770875e1a2218e856849d32ada2680e53aebc5d154a17e20",
"sha256:f057897711a630a0b7a6a03f1acf379b6ba25d37dc5dc217a97191984ba7f2fc",
"sha256:f37d45fab14ffef9d33a0dc3bc59ce0c5313e2253323312d47739192da94f5fd",
"sha256:f44906f70205d456d503105023041f1e63aece7623b31c390a0103db4de17537"
],
"version": "==5.4.0"
"version": "==5.2.0"
}
},
"develop": {}

README.md

@ -1,52 +1,227 @@
# capsul-flask
# capsulflask
![screenshot of capsul.org home page](./docs/capsul.webp)
Python Flask web application for capsul.org
Python Flask web application implementing user accounts, payment, and virtual machine management for a smol "virtual machine (vm) as a service" aka "cloud compute" provider. Originally developed by [Cyberia Computer Club](https://cyberia.club) for https://capsul.org
`capsul-flask` integrates with [Stripe](https://stripe.com/) as a credit card processor, and [BTCPay Server](https://github.com/btcpayserver/btcpayserver-docker) as a cryptocurrency payment processor.
## how to run locally
`capsul-flask` invokes [shell-scripts](./capsulflask/shell_scripts/) to create/manage [libvirt/qemu](https://www.libvirt.org/manpages/virsh.html) vms, and it depends on `dnsmasq` to act as the DHCP server for the vms.
`capsul-flask` has a ["hub and spoke" architecture](./architecture.md). The "Hub" runs the web application and talks to the Postgres database, while the "Spoke"(s) are responsible for creating/managing virtual machines. In this way, capsul can be scaled to span more than one machine. One instance of the capsul-flask application can run in both hub mode and spoke mode at the same time; however, there must only be one instance of the app running in "Hub" mode at any given time.
## Quickstart (run capsul-flask on your computer in development mode)
Ensure you have the pre-requisites for the psycopg2 Postgres database adapter package
```
# get an instance of postgres running locally on port 5432
# (you don't have to use docker, but we thought this might be the easiest for a how-to example)
docker run --rm -it -e POSTGRES_PASSWORD=dev -p 5432:5432 postgres &
sudo apt install python3-dev libpq-dev
pg_config --version
```
# install dependencies
sudo apt install pipenv python3-dev libpq-dev
Ensure you have the wonderful `pipenv` python package management and virtual environment cli
# download and run
git clone https://giit.cyberia.club/~forest/capsul-flask
cd capsul-flask
```
sudo apt install pipenv
```
Create python virtual environment and install packages
```
# install deps
pipenv install
```
Run an instance of Postgres (I used docker for this, you can use whatever you want; the point is that it's listening on localhost:5432)
```
docker run --rm -it -e POSTGRES_PASSWORD=dev -p 5432:5432 postgres
```
Run the app
```
pipenv run flask run
```
Interested in learning more? How about a trip to the `docs/` folder:
Run the app in gunicorn:
- [**Setting up `capsul-flask` locally**](./docs/local-set-up.md)
- [Manually](./docs/local-set-up.md#manually)
- [With docker-compose](./docs/local-set-up.md#docker_compose)
- [**Configuring `capsul-flask`**](./docs/configuration.md)
- [Example configuration from capsul.org (production)](./docs/configuration.md#example)
- [Configuration-type-stuff that lives in the database ](./docs/configuration.md#config_that_lives_in_db)
- [Loading variables from files (docker secrets)](./docs/configuration.md#docker_secrets)
- [**`capsul-flask`'s relationship to its Database Server**](./docs/database.md)
- [Database schema management (schema versions)](./docs/database.md#schema_management)
- [Running manual database queries](./docs/database.md#manual_queries)
- [**`capsul-flask`'s hub-and-spoke architecture**](./docs/architecture.md)
- [**Running the automated tests**](./docs/testing.md)
- [**Deploying capsul-flask on a server**](./docs/deployment.md)
- [Installing prerequisites for Spoke Mode](./docs/deployment.md#spoke_mode_prerequisites)
- [Deploying capsul-flask manually](./docs/deployment.md#deploy_manually)
- [Deploying capsul-flask with coop-cloud's docker-swarm configuration](./docs/deployment.md#coop_cloud_docker)
- [Deploying capsul-flask with coop-cloud's `abra` deployment tool](./docs/deployment.md#coop_cloud_abra)
- [**Accepting cryptocurrency payments with BTCPay Server**](./docs/btcpay.md)
- [Setting up the BTCPAY_PRIVATE_KEY](./docs/btcpay.md#BTCPAY_PRIVATE_KEY)
- [Testing cryptocurrency payments](./docs/btcpay.md#testing)
- [Sequence diagram explaining how BTC payment process works (how we accept 0-confirmation transactions 😀)](./docs/btcpay.md#0_conf_diagram)
```
pipenv run gunicorn --bind 127.0.0.1:5000 -k gevent --worker-connections 1000 app:app
```
Once you log in for the first time, you will want to give yourself some free capsulbux so you can create fake capsuls for testing.
Note that by default when running locally, the `SPOKE_MODEL` is set to `mock`, meaning that it won't actually try to spawn vms.
```
pipenv run flask cli sql -c "INSERT INTO payments (email, dollars) VALUES ('<your email address here>', 20.00)"
```
## configuration:
Create a `.env` file to set up the application configuration:
```
nano .env
```
You can add any environment variables referenced in `__init__.py` to this file.
For example, you may enter your SMTP credentials like this:
```
MAIL_USERNAME=forest@nullhex.com
MAIL_DEFAULT_SENDER=forest@nullhex.com
MAIL_PASSWORD=**************
```
## how to back up the database on the database server (legion.cyberia.club)
`sudo -u postgres pg_dump capsul-flask | gzip -9 > capsul-backup-2021-02-15.gz`
-----
## cli
You can manually mess around with the database like this:
```
pipenv run flask cli sql -f test.sql
```
```
pipenv run flask cli sql -c 'SELECT * FROM vms'
```
This one selects the vms table with the column name header:
```
pipenv run flask cli sql -c "SELECT string_agg(column_name::text, ', ') from information_schema.columns WHERE table_name='vms'; SELECT * from vms"
```
How to modify a payment manually, like if you get a chargeback or to fix customer payment issues:
```
$ pipenv run flask cli sql -c "SELECT id, created, email, dollars, invalidated from payments"
1, 2020-05-05T00:00:00, forest.n.johnson@gmail.com, 20.00, FALSE
$ pipenv run flask cli sql -c "UPDATE payments SET invalidated = True WHERE id = 1"
1 rows affected.
$ pipenv run flask cli sql -c "SELECT id, created, email, dollars, invalidated from payments"
1, 2020-05-05T00:00:00, forest.n.johnson@gmail.com, 20.00, TRUE
```
How you would kick off the scheduled task:
```
pipenv run flask cli cron-task
```
-----
## postgres database schema management
capsulflask has a concept of a schema version. When the application starts, it will query the database for a table named
`schemaversion` that has one row and one column (`version`). If the `version` it finds is not equal to the `desiredSchemaVersion` variable set in `db.py`, it will run migration scripts from the `schema_migrations` folder one by one until the `schemaversion` table shows the correct version.
For example, the script named `02_up_xyz.sql` should contain code that migrates the database from schema version 1 to schema version 2. Likewise, the script `02_down_xyz.sql` should contain code that migrates from schema version 2 back to schema version 1.
**IMPORTANT: if you need to make changes to the schema, make a NEW schema version. DO NOT EDIT the existing schema versions.**
In general, for safety, schema version upgrades should not delete data. Schema version downgrades will simply throw an error and exit for now.
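A simplified sketch of that version check might look like the following. This is illustrative only, not the actual `db.py` implementation; the target version and the helper name are made up.

```python
# Sketch of the schema-version check described above (assumption: the real
# logic in capsulflask/db.py differs in detail).
import glob

desiredSchemaVersion = 2  # hypothetical target version


def migrate_if_needed(cursor):
    cursor.execute("SELECT version FROM schemaversion")
    version = cursor.fetchone()[0]

    if version > desiredSchemaVersion:
        # downgrades are not performed automatically; throw an error and exit
        raise Exception("database schema is newer than this code expects")

    while version < desiredSchemaVersion:
        version += 1
        # e.g. schema_migrations/02_up_xyz.sql migrates schema version 1 -> 2
        migration_file = glob.glob(f"schema_migrations/{version:02d}_up_*.sql")[0]
        with open(migration_file) as f:
            cursor.execute(f.read())
        cursor.execute("UPDATE schemaversion SET version = %s", (version,))
```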
-----
## hub-and-spoke architecture
![](readme/hub-and-spoke1.png)
This diagram was created with https://app.diagrams.net/.
To edit it, download the <a download href="readme/hub-and-spoke.xml">diagram file</a> and edit it with the https://app.diagrams.net/ web application, or you may run the application from [source](https://github.com/jgraph/drawio) if you wish.
Right now there are 2 types of operations, immediate mode and async.
Both types of operations do assignment synchronously, so if the system can't assign the operation to one or more hosts (spokes), or to whatever else the operation requires, then it will fail.
Some operations tolerate partial failures; for example, `capacity_avaliable` will succeed if at least one spoke succeeds.
For immediate mode requests (like `list`, `capacity_avaliable`, `destroy`), assignment and completion of the operation are the same thing.
For async ones (like `create`), they can be assigned without knowing whether or not they succeeded.
![](readme/hub-and-spoke2.png)
This diagram was created with https://app.diagrams.net/.
To edit it, download the <a download href="readme/hub-and-spoke.xml">diagram file</a> and edit it with the https://app.diagrams.net/ web application, or you may run the application from [source](https://github.com/jgraph/drawio) if you wish.
If you issue a create and it technically could go to any number of hosts, but only one host responds, it will succeed.
But if you issue a create and somehow 2 hosts both think they own that task, it will fail and throw a big error, because exactly 1 host is expected to own the create task.
Currently it's not set up to do any polling; it's not really like a queue at all. It's all immediate for the most part.
-----
## how to setup btcpay server
Generate a private key and the accompanying bitpay SIN for the btcpay API client.
I used this code as an example: https://github.com/bitpay/bitpay-python/blob/master/bitpay/key_utils.py#L6
```
$ pipenv run python ./readme/generate_btcpay_keys.py
```
It should output something looking like this:
```
-----BEGIN EC PRIVATE KEY-----
EXAMPLEIArx/EXAMPLEKH23EXAMPLEsYXEXAMPLE5qdEXAMPLEcFHoAcEXAMPLEK
oUQDQgAEnWs47PT8+ihhzyvXX6/yYMAWWODluRTR2Ix6ZY7Z+MV7v0W1maJzqeqq
NQ+cpBvPDbyrDk9+Uf/sEaRCma094g==
-----END EC PRIVATE KEY-----
EXAMPLEwzAEXAMPLEEXAMPLEURD7EXAMPLE
```
In order to register the key with the btcpay server, you have to first generate a pairing token using the btcpay server interface.
This requires your btcpay server account to have access to the capsul store. Ask Cass about this.
Navigate to `Manage store: Access Tokens` at: `https://btcpay.cyberia.club/stores/<store-id>/Tokens`
![](readme/btcpay_sin_pairing.jpg)
Finally, send an http request to the btcpay server to complete the pairing:
```
curl -H "Content-Type: application/json" https://btcpay.cyberia.club/tokens -d "{'id': 'EXAMPLEwzAEXAMPLEEXAMPLEURD7EXAMPLE', 'pairingCode': 'XXXXXXX'}"
```
It should respond with a token:
```
{"data":[{"policies":[],"pairingCode":"XXXXXXX","pairingExpiration":1589473817597,"dateCreated":1589472917597,"facade":"merchant","token":"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx","label":"capsulflask"}]}
```
And you should see the token in the btcpay server UI:
![](readme/paired.jpg)
Now simply set your `BTCPAY_PRIVATE_KEY` variable in `.env`
NOTE: make sure to use single quotes and replace the newlines with `\n`.
```
BTCPAY_PRIVATE_KEY='-----BEGIN EC PRIVATE KEY-----\nEXAMPLEIArx/EXAMPLEKH23EXAMPLEsYXEXAMPLE5qdEXAMPLEcFHoAcEXAMPLEK\noUQDQgAEnWs47PT8+ihhzyvXX6/yYMAWWODluRTR2Ix6ZY7Z+MV7v0W1maJzqeqq\nNQ+cpBvPDbyrDk9+Uf/sEaRCma094g==\n-----END EC PRIVATE KEY-----'
```
-----
## testing cryptocurrency payments
I used litecoin to test cryptocurrency payments, because it's the simplest & lowest-fee cryptocurrency that BTCPay server supports. You can download the easy-to-use litecoin SPV wallet `electrum-ltc` from [github.com/pooler/electrum-ltc](https://github.com/pooler/electrum-ltc) or [electrum-ltc.org](https://electrum-ltc.org/), set up a wallet, and then either purchase some litecoin from an exchange, or ask Forest for some litecoin to use for testing.
## sequence diagram explaining how BTC payment process works
![btcpayment_process](readme/btcpayment_process.png)
This diagram was created with https://app.diagrams.net/.
To edit it, download the <a download href="readme/btcpayment_process.drawio">diagram file</a> and edit it with the https://app.diagrams.net/ web application, or you may run the application from [source](https://github.com/jgraph/drawio) if you wish.

app.py

@ -1,5 +1,4 @@
from capsulflask import create_app
from capsulflask.http_client import MyHTTPClient
app = create_app(lambda timeout_seconds: MyHTTPClient(timeout_seconds=timeout_seconds))
create_app()
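This `app.py` change is what the test commits rely on: the lambda passed to `create_app()` builds the real `MyHTTPClient`, so test code can pass a different factory instead. A hypothetical test base class using the flask-testing package mentioned in the commits might look like this (sketch only; `FakeHTTPClient` and `BaseTestCase` are invented names).

```python
# Hypothetical test setup (sketch): inject a fake HTTP client via the same
# factory argument so tests never make real hub/spoke HTTP requests.
from flask_testing import TestCase

from capsulflask import create_app


class FakeHTTPClient:
    def __init__(self, timeout_seconds):
        self.timeout_seconds = timeout_seconds
        self.requests = []  # tests could inspect what would have been sent


class BaseTestCase(TestCase):
    def create_app(self):
        # flask_testing.TestCase calls this to build the app under test
        return create_app(lambda timeout_seconds: FakeHTTPClient(timeout_seconds))
```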


@ -2,7 +2,6 @@ import logging
from logging.config import dictConfig as logging_dict_config
import atexit
import jinja2
import os
import hashlib
import requests
@ -18,27 +17,17 @@ from flask import current_app
from apscheduler.schedulers.background import BackgroundScheduler
from capsulflask.shared import *
from capsulflask.shared import my_exec_info_message
from capsulflask import hub_model, spoke_model, cli
from capsulflask.btcpay import client as btcpay
from capsulflask.http_client import MyHTTPClient
def create_app(http_client_factory):
for var_name in [
"SPOKE_HOST_TOKEN", "HUB_TOKEN", "STRIPE_SECRET_KEY",
"BTCPAY_PRIVATE_KEY", "MAIL_PASSWORD"
]:
var = os.environ.get(f"{var_name}_FILE")
if not var:
continue
if not os.path.isfile(var):
continue
with open(var) as secret_file:
os.environ[var_name] = secret_file.read().rstrip('\n')
del os.environ[f"{var_name}_FILE"]
class StdoutMockFlaskMail:
def send(self, message: Message):
current_app.logger.info(f"Email would have been sent if configured:\n\nto: {','.join(message.recipients)}\nsubject: {message.subject}\nbody:\n\n{message.body}\n\n")
def create_app():
config = {
**dotenv_values(find_dotenv()),
@ -48,7 +37,7 @@ def create_app(http_client_factory):
app = Flask(__name__)
app.config.from_mapping(
TESTING=config.get("TESTING", "False").lower() in ['true', '1', 't', 'y', 'yes'],
TESTING=config.get("TESTING", False),
BASE_URL=config.get("BASE_URL", "http://localhost:5000"),
SECRET_KEY=config.get("SECRET_KEY", "dev"),
HUB_MODE_ENABLED=config.get("HUB_MODE_ENABLED", "True").lower() in ['true', '1', 't', 'y', 'yes'],
@ -59,7 +48,6 @@ def create_app(http_client_factory):
LOG_LEVEL=config.get("LOG_LEVEL", "INFO"),
SPOKE_HOST_ID=config.get("SPOKE_HOST_ID", "baikal"),
SPOKE_HOST_TOKEN=config.get("SPOKE_HOST_TOKEN", "changeme"),
SSH_USERNAME=os.environ.get("SSH_USERNAME", default="cyberian"),
HUB_TOKEN=config.get("HUB_TOKEN", "changeme"),
# https://www.postgresql.org/docs/9.1/libpq-ssl.html#LIBPQ-SSL-SSLMODE-STATEMENTS
@ -95,6 +83,30 @@ def create_app(http_client_factory):
app.config['HUB_URL'] = config.get("HUB_URL", app.config['BASE_URL'])
class SetLogLevelToDebugForHeartbeatRelatedMessagesFilter(logging.Filter):
def isHeartbeatRelatedString(self, thing):
# thing_string = "<error>"
is_in_string = False
try:
thing_string = "%s" % thing
is_in_string = 'heartbeat-task' in thing_string or 'hub/heartbeat' in thing_string or 'spoke/heartbeat' in thing_string
except:
pass
# self.warning("isHeartbeatRelatedString(%s): %s", thing_string, is_in_string )
return is_in_string
def filter(self, record):
if app.config['LOG_LEVEL'] == "DEBUG":
return True
if self.isHeartbeatRelatedString(record.msg):
return False
for arg in record.args:
if self.isHeartbeatRelatedString(arg):
return False
return True
logging_dict_config({
'version': 1,
'formatters': {'default': {
@ -117,11 +129,11 @@ def create_app(http_client_factory):
}
})
# mylog_critical(app, "critical")
# mylog_error(app, "error")
# mylog_warning(app, "warning")
# mylog_info(app, "info")
# mylog_debug(app, "debug")
# app.logger.critical("critical")
# app.logger.error("error")
# app.logger.warning("warning")
# app.logger.info("info")
# app.logger.debug("debug")
stripe.api_key = app.config['STRIPE_SECRET_KEY']
stripe.api_version = app.config['STRIPE_API_VERSION']
@ -129,149 +141,116 @@ def create_app(http_client_factory):
if app.config['MAIL_SERVER'] != "":
app.config['FLASK_MAIL_INSTANCE'] = Mail(app)
else:
mylog_warning(app, "No MAIL_SERVER configured. capsul will simply print emails to stdout.")
app.logger.warning("No MAIL_SERVER configured. capsul will simply print emails to stdout.")
app.config['FLASK_MAIL_INSTANCE'] = StdoutMockFlaskMail()
# allow a mock http client to be injected by the test code.
app.config['HTTP_CLIENT'] = http_client_factory(int(app.config['INTERNAL_HTTP_TIMEOUT_SECONDS']))
app.config['HTTP_CLIENT'] = MyHTTPClient(timeout_seconds=int(app.config['INTERNAL_HTTP_TIMEOUT_SECONDS']))
try:
app.config['BTCPAY_CLIENT'] = btcpay.Client(api_uri=app.config['BTCPAY_URL'], pem=app.config['BTCPAY_PRIVATE_KEY'])
except:
app.logger.warning("unable to create btcpay client. Capsul will work fine except cryptocurrency payments will not work. The error was: " + my_exec_info_message(sys.exc_info()))
# only start the scheduler and attempt to migrate the database if we are running the app.
# otherwise we are running a CLI command.
command_line = ' '.join(sys.argv)
is_running_server = (
('flask run' in command_line) or
('gunicorn' in command_line) or
('test' in command_line)
)
app.logger.info(f"is_running_server: {is_running_server}")
if app.config['HUB_MODE_ENABLED']:
if app.config['HUB_MODEL'] == "capsul-flask":
app.config['HUB_MODEL'] = hub_model.CapsulFlaskHub()
# debug mode (flask reloader) runs two copies of the app. When running in debug mode,
# we only want to start the scheduler one time.
if is_running_server and (not app.debug or config.get('WERKZEUG_RUN_MAIN') == 'true'):
scheduler = BackgroundScheduler()
heartbeat_task_url = f"{app.config['HUB_URL']}/hub/heartbeat-task"
heartbeat_task_headers = {'Authorization': f"Bearer {app.config['HUB_TOKEN']}"}
heartbeat_task = lambda: requests.post(heartbeat_task_url, headers=heartbeat_task_headers)
scheduler.add_job(name="heartbeat-task", func=heartbeat_task, trigger="interval", seconds=5)
scheduler.start()
atexit.register(lambda: scheduler.shutdown())
else:
app.config['HUB_MODEL'] = hub_model.MockHub()
from capsulflask import db
db.init_app(app, is_running_server)
from capsulflask import (
auth, landing, console, payment, metrics, cli, hub_api, publicapi, admin
)
app.register_blueprint(auth.bp)
app.register_blueprint(landing.bp)
app.register_blueprint(console.bp)
app.register_blueprint(payment.bp)
app.register_blueprint(metrics.bp)
app.register_blueprint(cli.bp)
app.register_blueprint(hub_api.bp)
app.register_blueprint(admin.bp)
app.register_blueprint(publicapi.bp)
app.add_url_rule("/", endpoint="index")
if app.config['SPOKE_MODE_ENABLED']:
if app.config['SPOKE_MODEL'] == "shell-scripts":
app.config['SPOKE_MODEL'] = spoke_model.ShellScriptSpoke()
else:
app.config['SPOKE_MODEL'] = spoke_model.MockSpoke()
from capsulflask import spoke_api
app.register_blueprint(spoke_api.bp)
@app.after_request
def security_headers(response):
response.headers['X-Frame-Options'] = 'SAMEORIGIN'
if 'Content-Security-Policy' not in response.headers:
response.headers['Content-Security-Policy'] = "default-src 'self'"
response.headers['X-Content-Type-Options'] = 'nosniff'
return response
@app.context_processor
def override_url_for():
"""
override the url_for function built into flask
with our own custom implementation that busts the cache correctly when files change
"""
return dict(url_for=url_for_with_cache_bust)
def url_for_with_cache_bust(endpoint, **values):
"""
Add a query parameter based on the hash of the file, this acts as a cache bust
"""
app.config['BTCPAY_ENABLED'] = False
if app.config['BTCPAY_URL'] != "":
try:
app.config['BTCPAY_CLIENT'] = btcpay.Client(api_uri=app.config['BTCPAY_URL'], pem=app.config['BTCPAY_PRIVATE_KEY'])
app.config['BTCPAY_ENABLED'] = True
except:
mylog_warning(app, "unable to create btcpay client. Capsul will work fine except cryptocurrency payments will not work. The error was: " + my_exec_info_message(sys.exc_info()))
# only start the scheduler and attempt to migrate the database if we are running the app.
# otherwise we are running a CLI command.
command_line = ' '.join(sys.argv)
is_running_server = (
('flask run' in command_line) or
('gunicorn' in command_line) or
('test' in command_line)
)
mylog_info(app, f"is_running_server: {is_running_server}")
if app.config['HUB_MODE_ENABLED']:
if app.config['HUB_MODEL'] == "capsul-flask":
app.config['HUB_MODEL'] = hub_model.CapsulFlaskHub()
# debug mode (flask reloader) runs two copies of the app. When running in debug mode,
# we only want to start the scheduler one time.
if is_running_server and not app.config['TESTING'] and (not app.debug or config.get('WERKZEUG_RUN_MAIN') == 'true'):
scheduler = BackgroundScheduler()
heartbeat_task_url = f"{app.config['HUB_URL']}/hub/heartbeat-task"
heartbeat_task_headers = {'Authorization': f"Bearer {app.config['HUB_TOKEN']}"}
heartbeat_task = lambda: requests.post(heartbeat_task_url, headers=heartbeat_task_headers)
scheduler.add_job(name="heartbeat-task", func=heartbeat_task, trigger="interval", seconds=5)
scheduler.start()
atexit.register(lambda: scheduler.shutdown())
else:
app.config['HUB_MODEL'] = hub_model.MockHub()
from capsulflask import db
db.init_app(app, is_running_server)
from capsulflask import auth, landing, console, payment, metrics, cli, hub_api, admin
app.register_blueprint(landing.bp)
app.register_blueprint(auth.bp)
app.register_blueprint(console.bp)
app.register_blueprint(payment.bp)
app.register_blueprint(metrics.bp)
app.register_blueprint(cli.bp)
app.register_blueprint(hub_api.bp)
app.register_blueprint(admin.bp)
app.add_url_rule("/", endpoint="index")
if app.config['SPOKE_MODE_ENABLED']:
if app.config['SPOKE_MODEL'] == "shell-scripts":
app.config['SPOKE_MODEL'] = spoke_model.ShellScriptSpoke()
else:
app.config['SPOKE_MODEL'] = spoke_model.MockSpoke()
from capsulflask import spoke_api
app.register_blueprint(spoke_api.bp)
@app.after_request
def security_headers(response):
response.headers['X-Frame-Options'] = 'SAMEORIGIN'
if 'Content-Security-Policy' not in response.headers:
response.headers['Content-Security-Policy'] = "default-src 'self'"
response.headers['X-Content-Type-Options'] = 'nosniff'
return response
@app.context_processor
def override_url_for():
"""
override the url_for function built into flask
with our own custom implementation that busts the cache correctly when files change
"""
return dict(url_for=url_for_with_cache_bust)
return app
def url_for_with_cache_bust(endpoint, **values):
"""
Add a query parameter based on the hash of the file, this acts as a cache bust
"""
if endpoint == 'static':
filename = values.get('filename', None)
if filename:
if 'STATIC_FILE_HASH_CACHE' not in current_app.config:
current_app.config['STATIC_FILE_HASH_CACHE'] = dict()
if filename not in current_app.config['STATIC_FILE_HASH_CACHE']:
filepath = os.path.join(current_app.root_path, endpoint, filename)
#print(filepath)
if os.path.isfile(filepath) and os.access(filepath, os.R_OK):
with open(filepath, 'rb') as file:
hasher = hashlib.md5()
hasher.update(file.read())
current_app.config['STATIC_FILE_HASH_CACHE'][filename] = hasher.hexdigest()[-6:]
if endpoint == 'static':
filename = values.get('filename', None)
if filename:
if 'STATIC_FILE_HASH_CACHE' not in current_app.config:
current_app.config['STATIC_FILE_HASH_CACHE'] = dict()
if filename not in current_app.config['STATIC_FILE_HASH_CACHE']:
filepath = os.path.join(current_app.root_path, endpoint, filename)
#print(filepath)
if os.path.isfile(filepath) and os.access(filepath, os.R_OK):
values['q'] = current_app.config['STATIC_FILE_HASH_CACHE'][filename]
return url_for(endpoint, **values)
class StdoutMockFlaskMail:
def send(self, message: Message):
mylog_info(current_app, f"Email would have been sent if configured:\n\nto: {','.join(message.recipients)}\nsubject: {message.subject}\nbody:\n\n{message.body}\n\n")
class SetLogLevelToDebugForHeartbeatRelatedMessagesFilter(logging.Filter):
def isHeartbeatRelatedString(self, thing):
# thing_string = "<error>"
is_in_string = False
try:
thing_string = "%s" % thing
is_in_string = 'heartbeat-task' in thing_string or 'hub/heartbeat' in thing_string or 'spoke/heartbeat' in thing_string
except:
pass
# self.warning("isHeartbeatRelatedString(%s): %s", thing_string, is_in_string )
return is_in_string
def filter(self, record):
if not current_app or current_app.config['LOG_LEVEL'] == "DEBUG":
return True
if self.isHeartbeatRelatedString(record.msg):
return False
for arg in record.args:
if self.isHeartbeatRelatedString(arg):
return False
return True
with open(filepath, 'rb') as file:
hasher = hashlib.md5()
hasher.update(file.read())
current_app.config['STATIC_FILE_HASH_CACHE'][filename] = hasher.hexdigest()[-6:]
values['q'] = current_app.config['STATIC_FILE_HASH_CACHE'][filename]
return url_for(endpoint, **values)
return app

View File

@ -10,7 +10,7 @@ from nanoid import generate
from capsulflask.metrics import durations as metric_durations
from capsulflask.auth import admin_account_required
from capsulflask.db import get_model
from capsulflask.shared import *
from capsulflask.shared import my_exec_info_message
bp = Blueprint("admin", __name__, url_prefix="/admin")
@ -58,7 +58,7 @@ def index():
)
network['allocations'].append(allocation)
else:
mylog_warning(current_app, f"/admin: capsul {vm['id']} has public_ipv4 {vm['public_ipv4']} which is out of range for its host network {host_id} {network['network_name']} {network['public_ipv4_cidr_block']}")
current_app.logger.warning(f"/admin: capsul {vm['id']} has public_ipv4 {vm['public_ipv4']} which is out of range for its host network {host_id} {network['network_name']} {network['public_ipv4_cidr_block']}")
display_hosts.append(display_host)

View File

@ -1,3 +1,4 @@
from base64 import b64decode
import functools
import re
@ -16,7 +17,6 @@ from flask_mail import Message
from werkzeug.exceptions import abort
from capsulflask.db import get_model
from capsulflask.shared import *
bp = Blueprint("auth", __name__, url_prefix="/auth")
@ -25,6 +25,15 @@ def account_required(view):
@functools.wraps(view)
def wrapped_view(**kwargs):
api_token = request.headers.get('authorization', None)
if api_token is not None:
email = get_model().authenticate_token(b64decode(api_token).decode('utf-8'))
if email is not None:
session.clear()
session["account"] = email
session["csrf-token"] = generate()
if session.get("account") is None or session.get("csrf-token") is None :
return redirect(url_for("auth.login"))
@ -56,7 +65,7 @@ def login():
if not email:
errors.append("email is required")
elif len(email.strip()) < 6 or email.count('@') != 1 or email.count('.') == 0:
errors.append("enter a valid email address")
errors.append("enter a valid email address")
if len(errors) == 0:
result = get_model().login(email)

View File

@ -12,7 +12,7 @@ from psycopg2 import ProgrammingError
from flask_mail import Message
from capsulflask.db import get_model
from capsulflask.shared import *
from capsulflask.shared import my_exec_info_message
from capsulflask.console import get_account_balance
bp = Blueprint('cli', __name__)
@ -62,37 +62,25 @@ def sql_script(f, c):
model.connection.commit()
@bp.cli.command('account-balance')
@click.option('-u', help='users email address')
@with_appcontext
def account_balance(u):
vms = get_model().list_vms_for_account(u)
payments = get_model().list_payments_for_account(u)
click.echo(".")
click.echo(".")
click.echo(get_account_balance(vms, payments, datetime.utcnow()))
click.echo(".")
@bp.cli.command('cron-task')
@with_appcontext
def cron_task():
# make sure btcpay payments get completed (in case we miss a webhook), otherwise invalidate the payment
mylog_info(current_app, "cron_task: starting clean_up_unresolved_btcpay_invoices")
current_app.logger.info("cron_task: starting clean_up_unresolved_btcpay_invoices")
clean_up_unresolved_btcpay_invoices()
mylog_info(current_app, "cron_task: finished clean_up_unresolved_btcpay_invoices")
current_app.logger.info("cron_task: finished clean_up_unresolved_btcpay_invoices")
# notify when funds run out
mylog_info(current_app, "cron_task: starting notify_users_about_account_balance")
current_app.logger.info("cron_task: starting notify_users_about_account_balance")
notify_users_about_account_balance()
mylog_info(current_app, "cron_task: finished notify_users_about_account_balance")
current_app.logger.info("cron_task: finished notify_users_about_account_balance")
# make sure vm system and DB are synced
mylog_info(current_app, "cron_task: starting ensure_vms_and_db_are_synced")
current_app.logger.info("cron_task: starting ensure_vms_and_db_are_synced")
ensure_vms_and_db_are_synced()
mylog_info(current_app, "cron_task: finished ensure_vms_and_db_are_synced")
current_app.logger.info("cron_task: finished ensure_vms_and_db_are_synced")
@ -104,7 +92,7 @@ def clean_up_unresolved_btcpay_invoices():
try:
btcpay_invoice = current_app.config['BTCPAY_CLIENT'].get_invoice(invoice_id)
except:
mylog_error(current_app, f"""
current_app.logger.error(f"""
error was thrown when contacting btcpay server for invoice {invoice_id}:
{my_exec_info_message(sys.exc_info())}"""
)
@ -113,13 +101,13 @@ def clean_up_unresolved_btcpay_invoices():
days = float((datetime.now() - unresolved_invoice['created']).total_seconds())/float(60*60*24)
if btcpay_invoice['status'] == "complete":
mylog_info(current_app,
current_app.logger.info(
f"resolving btcpay invoice {invoice_id} "
f"({unresolved_invoice['email']}, ${unresolved_invoice['dollars']}) as completed "
)
get_model().btcpay_invoice_resolved(invoice_id, True)
elif days >= 1:
mylog_info(current_app,
current_app.logger.info(
f"resolving btcpay invoice {invoice_id} "
f"({unresolved_invoice['email']}, ${unresolved_invoice['dollars']}) as invalidated, "
f"btcpay server invoice status: {btcpay_invoice['status']}"
@ -248,7 +236,7 @@ def notify_users_about_account_balance():
index_to_send = i
if index_to_send > -1:
mylog_info(current_app, f"cron_task: sending {warnings[index_to_send]['id']} warning email to {account['email']}.")
current_app.logger.info(f"cron_task: sending {warnings[index_to_send]['id']} warning email to {account['email']}.")
get_body = warnings[index_to_send]['get_body']
get_subject = warnings[index_to_send]['get_subject']
current_app.config["FLASK_MAIL_INSTANCE"].send(
@ -262,7 +250,7 @@ def notify_users_about_account_balance():
get_model().set_account_balance_warning(account['email'], warnings[index_to_send]['id'])
if index_to_send == len(warnings)-1:
for vm in vms:
mylog_warning(current_app, f"cron_task: deleting {vm['id']} ( {account['email']} ) due to negative account balance.")
current_app.logger.warning(f"cron_task: deleting {vm['id']} ( {account['email']} ) due to negative account balance.")
current_app.config["HUB_MODEL"].destroy(email=account["email"], id=vm['id'])
get_model().delete_vm(email=account["email"], id=vm['id'])
@ -294,9 +282,9 @@ def ensure_vms_and_db_are_synced():
email_addresses_raw = current_app.config['ADMIN_EMAIL_ADDRESSES'].split(",")
email_addresses = list(filter(lambda x: len(x) > 6, map(lambda x: x.strip(), email_addresses_raw ) ))
mylog_info(current_app, f"cron_task: sending inconsistency warning email to {','.join(email_addresses)}:")
current_app.logger.info(f"cron_task: sending inconsistency warning email to {','.join(email_addresses)}:")
for error in errors:
mylog_info(current_app, f"cron_task: {error}.")
current_app.logger.info(f"cron_task: {error}.")
current_app.config["FLASK_MAIL_INSTANCE"].send(
Message(

View File

@ -1,7 +1,9 @@
from base64 import b64encode
from datetime import datetime, timedelta
import json
import re
import sys
import json
from datetime import datetime, timedelta
from flask import Blueprint
from flask import flash
from flask import current_app
@ -17,7 +19,7 @@ from nanoid import generate
from capsulflask.metrics import durations as metric_durations
from capsulflask.auth import account_required
from capsulflask.db import get_model
from capsulflask.shared import *
from capsulflask.shared import my_exec_info_message
from capsulflask.payment import poll_btcpay_session
from capsulflask import cli
@ -37,7 +39,7 @@ def double_check_capsul_address(id, ipv4, get_ssh_host_keys):
if result != None and result.ssh_host_keys != None and get_ssh_host_keys:
get_model().update_vm_ssh_host_keys(email=session["account"], id=id, ssh_host_keys=result.ssh_host_keys)
except:
mylog_error(current_app, f"""
current_app.logger.error(f"""
the virtualization model threw an error in double_check_capsul_address of {id}:
{my_exec_info_message(sys.exc_info())}"""
)
@ -98,7 +100,6 @@ def index():
@bp.route("/<string:id>", methods=("GET", "POST"))
@account_required
def detail(id):
duration=request.args.get('duration')
if not duration:
duration = "5m"
@ -108,8 +109,6 @@ def detail(id):
if vm is None:
return abort(404, f"{id} doesn't exist.")
vm['ssh_username'] = current_app.config['SSH_USERNAME']
if vm['deleted']:
return render_template("capsul-detail.html", vm=vm, delete=True, deleted=True)
@ -137,7 +136,7 @@ def detail(id):
delete=True
)
else:
mylog_info(current_app, f"deleting {vm['id']} per user request ({session['account']})")
current_app.logger.info(f"deleting {vm['id']} per user request ({session['account']})")
current_app.config["HUB_MODEL"].destroy(email=session['account'], id=id)
get_model().delete_vm(email=session['account'], id=id)
@ -151,7 +150,7 @@ def detail(id):
force_stop=True,
)
else:
mylog_info(current_app, f"force stopping {vm['id']} per user request ({session['account']})")
current_app.logger.info(f"force stopping {vm['id']} per user request ({session['account']})")
current_app.config["HUB_MODEL"].vm_state_command(email=session['account'], id=id, command="force-stop")
vm["state"] = "stopped"
@ -190,11 +189,71 @@ def detail(id):
duration=duration
)
def _create(vm_sizes, operating_systems, public_keys_for_account, server_data):
errors = list()
size = server_data.get("size")
os = server_data.get("os")
posted_keys_count = int(server_data.get("ssh_authorized_key_count"))
if not size:
errors.append("Size is required")
elif size not in vm_sizes:
errors.append(f"Invalid size {size}")
if not os:
errors.append("OS is required")
elif os not in operating_systems:
errors.append(f"Invalid os {os}")
posted_keys = list()
if posted_keys_count > 1000:
errors.append("something went wrong with ssh keys")
else:
for i in range(0, posted_keys_count):
if f"ssh_key_{i}" in server_data:
posted_name = server_data.get(f"ssh_key_{i}")
key = None
for x in public_keys_for_account:
if x['name'] == posted_name:
key = x
if key:
posted_keys.append(key)
else:
errors.append(f"SSH Key \"{posted_name}\" doesn't exist")
if len(posted_keys) == 0:
errors.append("At least one SSH Public Key is required")
capacity_avaliable = current_app.config["HUB_MODEL"].capacity_avaliable(
vm_sizes[size]['memory_mb']*1024*1024
)
if not capacity_avaliable:
errors.append("""
host(s) at capacity. no capsuls can be created at this time. sorry.
""")
if len(errors) == 0:
id = make_capsul_id()
current_app.config["HUB_MODEL"].create(
email = session["account"],
id=id,
os=os,
size=size,
template_image_file_name=operating_systems[os]['template_image_file_name'],
vcpus=vm_sizes[size]['vcpus'],
memory_mb=vm_sizes[size]['memory_mb'],
ssh_authorized_keys=list(map(lambda x: x["content"], posted_keys))
)
return id, errors
return None, errors
@bp.route("/create", methods=("GET", "POST"))
@account_required
def create():
vm_sizes = get_model().vm_sizes_dict()
operating_systems = get_model().operating_systems_dict()
public_keys_for_account = get_model().list_ssh_public_keys_for_account(session["account"])
@ -202,6 +261,17 @@ def create():
capacity_avaliable = current_app.config["HUB_MODEL"].capacity_avaliable(512*1024*1024)
errors = list()
if request.method == "POST":
if "csrf-token" not in request.form or request.form['csrf-token'] != session['csrf-token']:
return abort(418, f"u want tea")
id, errors = _create(
vm_sizes,
operating_systems,
public_keys_for_account,
request.form)
if len(errors) == 0:
return redirect(f"{url_for('console.index')}?created={id}")
affordable_vm_sizes = dict()
for key, vm_size in vm_sizes.items():
# if a user deposits $7.50 and then creates an f1-s vm which costs 7.50 a month,
@ -210,74 +280,11 @@ def create():
if vm_size["dollars_per_month"] <= account_balance+0.25:
affordable_vm_sizes[key] = vm_size
if request.method == "POST":
if "csrf-token" not in request.form or request.form['csrf-token'] != session['csrf-token']:
return abort(418, f"u want tea")
size = request.form["size"]
os = request.form["os"]
if not size:
errors.append("Size is required")
elif size not in vm_sizes:
errors.append(f"Invalid size {size}")
elif size not in affordable_vm_sizes:
errors.append(f"Your account must have enough credit to run an {size} for 1 month before you will be allowed to create it")
if not os:
errors.append("OS is required")
elif os not in operating_systems:
errors.append(f"Invalid os {os}")
posted_keys_count = int(request.form["ssh_authorized_key_count"])
posted_keys = list()
if posted_keys_count > 1000:
errors.append("something went wrong with ssh keys")
else:
for i in range(0, posted_keys_count):
if f"ssh_key_{i}" in request.form:
posted_name = request.form[f"ssh_key_{i}"]
key = None
for x in public_keys_for_account:
if x['name'] == posted_name:
key = x
if key:
posted_keys.append(key)
else:
errors.append(f"SSH Key \"{posted_name}\" doesn't exist")
if len(posted_keys) == 0:
errors.append("At least one SSH Public Key is required")
capacity_avaliable = current_app.config["HUB_MODEL"].capacity_avaliable(vm_sizes[size]['memory_mb']*1024*1024)
if not capacity_avaliable:
errors.append("""
host(s) at capacity. no capsuls can be created at this time. sorry.
""")
if len(errors) == 0:
id = make_capsul_id()
# we can't create the vm record in the DB yet because its IP address needs to be allocated first.
# so it will be created when the allocation happens inside the hub_api.
current_app.config["HUB_MODEL"].create(
email = session["account"],
id=id,
os=os,
size=size,
template_image_file_name=operating_systems[os]['template_image_file_name'],
vcpus=vm_sizes[size]['vcpus'],
memory_mb=vm_sizes[size]['memory_mb'],
ssh_authorized_keys=list(map(lambda x: dict(name=x['name'], content=x['content']), posted_keys))
)
return redirect(f"{url_for('console.index')}?created={id}")
for error in errors:
flash(error)
if not capacity_avaliable:
mylog_warning(current_app, f"when capsul capacity is restored, send an email to {session['account']}")
current_app.logger.warning(f"when capsul capacity is restored, send an email to {session['account']}")
return render_template(
"create-capsul.html",
@ -292,23 +299,25 @@ def create():
vm_sizes=affordable_vm_sizes
)
@bp.route("/ssh", methods=("GET", "POST"))
@bp.route("/keys", methods=("GET", "POST"))
@account_required
def ssh_public_keys():
def ssh_api_keys():
errors = list()
token = None
if request.method == "POST":
if "csrf-token" not in request.form or request.form['csrf-token'] != session['csrf-token']:
return abort(418, f"u want tea")
method = request.form["method"]
content = None
if method == "POST":
action = request.form["action"]
if action == 'upload_ssh_key':
content = None
content = request.form["content"].replace("\r", " ").replace("\n", " ").strip()
name = request.form["name"]
if not name or len(name.strip()) < 1:
if method == "POST":
name = request.form["name"]
if not name or len(name.strip()) < 1:
parts = re.split(" +", content)
if len(parts) > 2 and len(parts[2].strip()) > 0:
name = parts[2].strip()
@ -316,10 +325,9 @@ def ssh_public_keys():
name = parts[0].strip()
else:
errors.append("Name is required")
if not re.match(r"^[0-9A-Za-z_@:. -]+$", name):
errors.append(f"Key name '{name}' must match \"^[0-9A-Za-z_@:. -]+$\"")
if not re.match(r"^[0-9A-Za-z_@:. -]+$", name):
errors.append(f"Key name '{name}' must match \"^[0-9A-Za-z_@:. -]+$\"")
if method == "POST":
if not content or len(content.strip()) < 1:
errors.append("Content is required")
else:
@ -332,31 +340,47 @@ def ssh_public_keys():
if len(errors) == 0:
get_model().create_ssh_public_key(session["account"], name, content)
elif method == "DELETE":
elif action == "delete_ssh_key":
get_model().delete_ssh_public_key(session["account"], name)
if len(errors) == 0:
get_model().delete_ssh_public_key(session["account"], name)
elif action == "generate_api_token":
name = request.form["name"]
if name == '':
name = datetime.utcnow().strftime('%y-%m-%d %H:%M:%S')
token = b64encode(
get_model().generate_api_token(session["account"], name).encode('utf-8')
).decode('utf-8')
elif action == "delete_api_token":
get_model().delete_api_token(session["account"], request.form["id"])
for error in errors:
flash(error)
keys_list=list(map(
ssh_keys_list=list(map(
lambda x: dict(name=x['name'], content=f"{x['content'][:20]}...{x['content'][len(x['content'])-20:]}"),
get_model().list_ssh_public_keys_for_account(session["account"])
))
api_tokens_list = get_model().list_api_tokens(session["account"])
return render_template(
"ssh-public-keys.html",
"keys.html",
csrf_token = session["csrf-token"],
ssh_public_keys=keys_list,
has_ssh_public_keys=len(keys_list) > 0
api_tokens=api_tokens_list,
ssh_public_keys=ssh_keys_list,
generated_api_token=token,
)
def get_vms():
return get_model().list_vms_for_account(session["account"])
if 'user_vms' not in g:
g.user_vms = get_model().list_vms_for_account(session["account"])
return g.user_vms
def get_payments():
return get_model().list_payments_for_account(session["account"])
if 'user_payments' not in g:
g.user_payments = get_model().list_payments_for_account(session["account"])
return g.user_payments
average_number_of_days_in_a_month = 30.44
@ -369,7 +393,6 @@ def get_vm_months_float(vm, as_of):
return days / average_number_of_days_in_a_month
def get_account_balance(vms, payments, as_of):
vm_cost_dollars = 0.0
for vm in vms:
vm_months = get_vm_months_float(vm, as_of)
@ -382,7 +405,6 @@ def get_account_balance(vms, payments, as_of):
@bp.route("/account-balance")
@account_required
def account_balance():
payment_sessions = get_model().list_payment_sessions_for_account(session['account'])
for payment_session in payment_sessions:
if payment_session['type'] == 'btcpay':
@ -408,7 +430,7 @@ def account_balance():
vms_billed = list()
for vm in vms:
for vm in get_vms():
vm_months = get_vm_months_float(vm, datetime.utcnow())
vms_billed.append(dict(
id=vm["id"],
@ -424,7 +446,6 @@ def account_balance():
has_vms=len(vms_billed)>0,
vms_billed=vms_billed,
warning_text=warning_text,
btcpay_enabled=current_app.config["BTCPAY_ENABLED"],
payments=list(map(
lambda x: dict(
dollars=x["dollars"],

View File

@ -8,7 +8,7 @@ from flask import current_app
from flask import g
from capsulflask.db_model import DBModel
from capsulflask.shared import *
from capsulflask.shared import my_exec_info_message
def init_app(app, is_running_server):
@ -28,12 +28,12 @@ def init_app(app, is_running_server):
schemaMigrations = {}
schemaMigrationsPath = join(app.root_path, 'schema_migrations')
mylog_info(app, "loading schema migration scripts from {}".format(schemaMigrationsPath))
app.logger.info("loading schema migration scripts from {}".format(schemaMigrationsPath))
for filename in listdir(schemaMigrationsPath):
result = re.search(r"^\d+_(up|down)", filename)
if not result:
mylog_error(app, f"schemaVersion {filename} must match ^\\d+_(up|down). exiting.")
exit(1)
app.logger.error(f"schemaVersion {filename} must match ^\\d+_(up|down). exiting.")
continue
key = result.group()
with open(join(schemaMigrationsPath, filename), 'rb') as file:
schemaMigrations[key] = file.read().decode("utf8")
@ -57,12 +57,12 @@ def init_app(app, is_running_server):
hasSchemaVersionTable = True
if hasSchemaVersionTable == False:
mylog_info(app, "no table named schemaversion found in the {} schema. running migration 01_up".format(app.config['DATABASE_SCHEMA']))
app.logger.info("no table named schemaversion found in the {} schema. running migration 01_up".format(app.config['DATABASE_SCHEMA']))
try:
cursor.execute(schemaMigrations["01_up"])
connection.commit()
except:
mylog_error(app, "unable to create the schemaversion table because: {}".format(my_exec_info_message(sys.exc_info())))
app.logger.error("unable to create the schemaversion table because: {}".format(my_exec_info_message(sys.exc_info())))
exit(1)
actionWasTaken = True
@ -70,24 +70,24 @@ def init_app(app, is_running_server):
schemaVersion = cursor.fetchall()[0][0]
if schemaVersion > desiredSchemaVersion:
mylog_critical(app, "schemaVersion ({}) > desiredSchemaVersion ({}). schema downgrades are not supported yet. exiting.".format(
app.logger.critical("schemaVersion ({}) > desiredSchemaVersion ({}). schema downgrades are not supported yet. exiting.".format(
schemaVersion, desiredSchemaVersion
))
exit(1)
while schemaVersion < desiredSchemaVersion:
migrationKey = "%02d_up" % (schemaVersion+1)
mylog_info(app, "schemaVersion ({}) < desiredSchemaVersion ({}). running migration {}".format(
app.logger.info("schemaVersion ({}) < desiredSchemaVersion ({}). running migration {}".format(
schemaVersion, desiredSchemaVersion, migrationKey
))
try:
cursor.execute(schemaMigrations[migrationKey])
connection.commit()
except KeyError:
mylog_critical(app, "missing schema migration script: {}_xyz.sql".format(migrationKey))
app.logger.critical("missing schema migration script: {}_xyz.sql".format(migrationKey))
exit(1)
except:
mylog_critical(app, "unable to execute the schema migration {} because: {}".format(migrationKey, my_exec_info_message(sys.exc_info())))
app.logger.critical("unable to execute the schema migration {} because: {}".format(migrationKey, my_exec_info_message(sys.exc_info())))
exit(1)
actionWasTaken = True
@ -96,7 +96,7 @@ def init_app(app, is_running_server):
versionFromDatabase = cursor.fetchall()[0][0]
if schemaVersion != versionFromDatabase:
mylog_critical(app, "incorrect schema version value \"{}\" after running migration {}, expected \"{}\". exiting.".format(
app.logger.critical("incorrect schema version value \"{}\" after running migration {}, expected \"{}\". exiting.".format(
versionFromDatabase,
migrationKey,
schemaVersion
@ -107,14 +107,14 @@ def init_app(app, is_running_server):
app.config['PSYCOPG2_CONNECTION_POOL'].putconn(connection)
mylog_info(app, "{} current schemaVersion: \"{}\"".format(
app.logger.info("{} current schemaVersion: \"{}\"".format(
("schema migration completed." if actionWasTaken else "schema is already up to date. "), schemaVersion
))
def get_model() -> DBModel:
def get_model():
if 'db_model' not in g:
connection = current_app.config['PSYCOPG2_CONNECTION_POOL'].getconn()
cursor = connection.cursor()
@ -128,4 +128,3 @@ def close_db(e=None):
if db_model is not None:
db_model.cursor.close()
current_app.config['PSYCOPG2_CONNECTION_POOL'].putconn(db_model.connection)

View File

@ -1,13 +1,13 @@
import re
# I was never able to get this type hinting to work correctly
# from psycopg2.extensions import connection as Psycopg2Connection, cursor as Psycopg2Cursor
import hashlib
from nanoid import generate
from flask import current_app
from typing import List
from capsulflask.shared import *
from capsulflask.shared import OnlineHost
class DBModel:
@ -17,7 +17,6 @@ class DBModel:
self.cursor = cursor
# ------ LOGIN ---------
@ -43,6 +42,16 @@ class DBModel:
self.connection.commit()
return (token, ignoreCaseMatches)
def authenticate_token(self, token):
m = hashlib.md5()
m.update(token.encode('utf-8'))
hash_token = m.hexdigest()
self.cursor.execute("SELECT email FROM api_tokens WHERE token = %s", (hash_token, ))
result = self.cursor.fetchall()
if len(result) == 1:
return result[0]
return None
def consume_token(self, token):
self.cursor.execute("SELECT email FROM login_tokens WHERE token = %s and created > (NOW() - INTERVAL '20 min')", (token, ))
@ -132,6 +141,32 @@ class DBModel:
self.cursor.execute( "DELETE FROM ssh_public_keys where email = %s AND name = %s", (email, name) )
self.connection.commit()
def list_api_tokens(self, email):
self.cursor.execute(
"SELECT id, token, name, created FROM api_tokens WHERE email = %s",
(email, )
)
return list(map(
lambda x: dict(id=x[0], token=x[1], name=x[2], created=x[3]),
self.cursor.fetchall()
))
def generate_api_token(self, email, name):
token = generate()
m = hashlib.md5()
m.update(token.encode('utf-8'))
hash_token = m.hexdigest()
self.cursor.execute(
"INSERT INTO api_tokens (email, name, token) VALUES (%s, %s, %s)",
(email, name, hash_token)
)
self.connection.commit()
return token
def delete_api_token(self, email, id_):
self.cursor.execute( "DELETE FROM api_tokens where email = %s AND id = %s", (email, id_))
self.connection.commit()
def list_vms_for_account(self, email):
self.cursor.execute("""
SELECT vms.id, vms.public_ipv4, vms.public_ipv6, vms.size, vms.os, vms.created, vms.deleted, vm_sizes.dollars_per_month
@ -270,7 +305,7 @@ class DBModel:
row = self.cursor.fetchone()
if row:
if int(row[1]) != int(dollars):
mylog_warning(current_app, f"""
current_app.logger.warning(f"""
{payment_type} gave us a completed payment session with a different dollar amount than what we had recorded!!
id: {id}
account: {row[0]}
@ -381,7 +416,7 @@ class DBModel:
def list_all_operations(self):
self.cursor.execute("""
SELECT operations.id, operations.email, operations.created, operations.payload,
host_operation.host, host_operation.assignment_status, host_operation.assigned,
host_operation.host host_operation.assignment_status, host_operation.assigned,
host_operation.completed, host_operation.results FROM operations
JOIN host_operation ON host_operation.operation = operations.id
""")
@ -479,8 +514,3 @@ class DBModel:
#cursor.close()
return to_return

View File

@ -8,7 +8,7 @@ import threading
import aiohttp
import asyncio
from flask import current_app
from capsulflask.shared import *
from capsulflask.shared import OnlineHost, my_exec_info_message
from typing import List
class HTTPResult:
@ -33,7 +33,7 @@ class MyHTTPClient:
toReturn = []
for individualResult in fromOtherThread:
if individualResult.error != None and individualResult.error != "":
mylog_error(current_app, individualResult.error)
current_app.logger.error(individualResult.error)
toReturn.append(individualResult.http_result)
return toReturn
@ -42,7 +42,7 @@ class MyHTTPClient:
future = run_coroutine(self.do_http(method=method, url=url, body=body, authorization_header=authorization_header))
fromOtherThread = future.result()
if fromOtherThread.error != None and fromOtherThread.error != "":
mylog_error(current_app, fromOtherThread.error)
current_app.logger.error(fromOtherThread.error)
return fromOtherThread.http_result
def get_client_session(self):

View File

@ -7,7 +7,7 @@ from flask import request, make_response
from werkzeug.exceptions import abort
from capsulflask.db import get_model
from capsulflask.shared import *
from capsulflask.shared import my_exec_info_message, authorized_as_hub
bp = Blueprint("hub", __name__, url_prefix="/hub")
@ -21,19 +21,19 @@ def authorized_for_host(id):
def ping_all_hosts_task():
if authorized_as_hub(request.headers):
all_hosts = get_model().get_all_hosts()
mylog_debug(current_app, f"pinging {len(all_hosts)} hosts...")
current_app.logger.debug(f"pinging {len(all_hosts)} hosts...")
authorization_header = f"Bearer {current_app.config['HUB_TOKEN']}"
results = current_app.config["HTTP_CLIENT"].do_multi_http_sync(all_hosts, "/spoke/heartbeat", None, authorization_header=authorization_header)
for i in range(len(all_hosts)):
host = all_hosts[i]
result = results[i]
mylog_debug(current_app, f"response from {host.id} ({host.url}): {result.status_code} {result.body}")
current_app.logger.debug(f"response from {host.id} ({host.url}): {result.status_code} {result.body}")
if result.status_code == 200:
get_model().host_heartbeat(host.id)
return "ok"
else:
mylog_warning(current_app, f"/hub/heartbeat-task returned 401: invalid hub token")
current_app.logger.warning(f"/hub/heartbeat-task returned 401: invalid hub token")
return abort(401, "invalid hub token")
@ -42,7 +42,7 @@ def heartbeat(host_id):
if authorized_for_host(host_id):
return "ok"
else:
mylog_warning(current_app, f"/hub/heartbeat/{host_id} returned 401: invalid token")
current_app.logger.warning(f"/hub/heartbeat/{host_id} returned 401: invalid token")
return abort(401, "invalid host id or token")
@bp.route("/claim-operation/<int:operation_id>/<string:host_id>", methods=("POST",))
@ -51,7 +51,7 @@ def claim_operation(operation_id: int, host_id: str):
payload_json = get_model().get_payload_json_from_host_operation(operation_id, host_id)
if payload_json is None:
error_message = f"{host_id} can't claim operation {operation_id} because host_operation row not found"
mylog_error(current_app, error_message)
current_app.logger.error(error_message)
return abort(404, error_message)
# right now if there is a can_claim_handler there needs to be a corresponding on_claimed_handler.
@ -84,7 +84,7 @@ def claim_operation(operation_id: int, host_id: str):
if error_message != "":
error_message = f"{host_id} can't claim operation {operation_id} because {error_message}"
mylog_error(current_app, error_message)
current_app.logger.error(error_message)
return abort(400, error_message)
# we will only return this payload as json if claiming succeeds, so might as well do this now...
@ -97,7 +97,7 @@ def claim_operation(operation_id: int, host_id: str):
if error_message != "":
error_message = f"{host_id} can't claim operation {operation_id} because {error_message}"
mylog_error(current_app, error_message)
current_app.logger.error(error_message)
return abort(400, error_message)
claimed = get_model().claim_operation(operation_id, host_id)
@ -113,7 +113,7 @@ def claim_operation(operation_id: int, host_id: str):
else:
return abort(409, f"operation was already assigned to another host")
else:
mylog_warning(current_app, f"/hub/claim-operation/{operation_id}/{host_id} returned 401: invalid token")
current_app.logger.warning(f"/hub/claim-operation/{operation_id}/{host_id} returned 401: invalid token")
return abort(401, "invalid host id or token")
def can_claim_create(payload, host_id) -> (str, str):

View File

@ -14,13 +14,9 @@ from subprocess import run
from capsulflask.db import get_model
from capsulflask.http_client import HTTPResult
from capsulflask.shared import *
from capsulflask.shared import VirtualizationInterface, VirtualMachine, OnlineHost, validate_capsul_id, my_exec_info_message
class MockHub(VirtualizationInterface):
def __init__(self):
self.default_network = "public1"
self.default_ipv4 = "1.1.1.1"
def capacity_avaliable(self, additional_ram_bytes):
return True
@ -33,36 +29,27 @@ class MockHub(VirtualizationInterface):
{"key_type":"RSA", "content":"ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQCvotgzgEP65JUQ8S8OoNKy1uEEPEAcFetSp7QpONe6hj4wPgyFNgVtdoWdNcU19dX3hpdse0G8OlaMUTnNVuRlbIZXuifXQ2jTtCFUA2mmJ5bF+XjGm3TXKMNGh9PN+wEPUeWd14vZL+QPUMev5LmA8cawPiU5+vVMLid93HRBj118aCJFQxLgrdP48VPfKHFRfCR6TIjg1ii3dH4acdJAvlmJ3GFB6ICT42EmBqskz2MPe0rIFxH8YohCBbAbrbWYcptHt4e48h4UdpZdYOhEdv89GrT8BF2C5cbQ5i9qVpI57bXKrj8hPZU5of48UHLSpXG8mbH0YDiOQOfKX/Mt", "sha256":"ghee6KzRnBJhND2kEUZSaouk7CD6o6z2aAc8GPkV+GQ"},
{"key_type":"ECDSA", "content":"ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLLgOoATz9R4aS2kk7vWoxX+lshK63t9+5BIHdzZeFE1o+shlcf0Wji8cN/L1+m3bi0uSETZDOAWMP3rHLJj9Hk=", "sha256":"aCYG1aD8cv/TjzJL0bi9jdabMGksdkfa7R8dCGm1yYs"}
]""")
return VirtualMachine(id, current_app.config["SPOKE_HOST_ID"], ipv4=self.default_ipv4, ssh_host_keys=ssh_host_keys)
return VirtualMachine(id, current_app.config["SPOKE_HOST_ID"], ipv4="1.1.1.1", ssh_host_keys=ssh_host_keys)
return VirtualMachine(id, current_app.config["SPOKE_HOST_ID"], ipv4=self.default_ipv4)
return VirtualMachine(id, current_app.config["SPOKE_HOST_ID"], ipv4="1.1.1.1")
def list_ids(self) -> list:
return get_model().all_non_deleted_vm_ids()
def create(self, email: str, id: str, os: str, size: str, template_image_file_name: str, vcpus: int, memory_mb: int, ssh_authorized_keys: list):
validate_capsul_id(id)
mylog_info(current_app, f"mock create: {id} for {email}")
current_app.logger.info(f"mock create: {id} for {email}")
sleep(1)
get_model().create_vm(
email=email,
id=id,
size=size,
os=os,
host=current_app.config["SPOKE_HOST_ID"],
network_name=self.default_network,
public_ipv4=self.default_ipv4,
ssh_authorized_keys=list(map(lambda x: x["name"], ssh_authorized_keys)),
)
def destroy(self, email: str, id: str):
mylog_info(current_app, f"mock destroy: {id} for {email}")
current_app.logger.info(f"mock destroy: {id} for {email}")
def vm_state_command(self, email: str, id: str, command: str):
mylog_info(current_app, f"mock {command}: {id} for {email}")
current_app.logger.info(f"mock {command}: {id} for {email}")
class CapsulFlaskHub(VirtualizationInterface):
def synchronous_operation(self, hosts: List[OnlineHost], email: str, payload: str) -> List[HTTPResult]:
return self.generic_operation(hosts, email, payload, True)[1]
@ -119,7 +106,7 @@ class CapsulFlaskHub(VirtualizationInterface):
operation_desc = ""
if operation_id:
operation_desc = f"for operation {operation_id}"
mylog_error(current_app, f"""error reading assignment_status {operation_desc} from host {host.id}:
current_app.logger.error(f"""error reading assignment_status {operation_desc} from host {host.id}:
result_is_json: {result_is_json}
result_is_dict: {result_is_dict}
result_has_status: {result_has_status}
@ -183,10 +170,10 @@ class CapsulFlaskHub(VirtualizationInterface):
except:
all_valid = False
if not all_valid:
mylog_error(current_app, f"""error reading ids for list_ids operation, host {host.id}""")
current_app.logger.error(f"""error reading ids for list_ids operation, host {host.id}""")
else:
result_json_string = json.dumps({"error_message": "invalid response, missing 'ids' list"})
mylog_error(current_app, f"""missing 'ids' list for list_ids operation, host {host.id}""")
current_app.logger.error(f"""missing 'ids' list for list_ids operation, host {host.id}""")
except:
# no need to do anything here since if it cant be parsed then generic_operation will handle it.
pass
@ -196,6 +183,7 @@ class CapsulFlaskHub(VirtualizationInterface):
def create(self, email: str, id: str, os: str, size: str, template_image_file_name: str, vcpus: int, memory_mb: int, ssh_authorized_keys: list):
validate_capsul_id(id)
online_hosts = get_model().get_online_hosts()
#current_app.logger.debug(f"hub_model.create(): ${len(online_hosts)} hosts")
payload = json.dumps(dict(
type="create",
email=email,
@ -274,3 +262,4 @@ class CapsulFlaskHub(VirtualizationInterface):
if not result_status == "success":
raise ValueError(f"""failed to {command} vm "{id}" on host "{host.id}" for {email}: {result_json_string}""")

View File

@ -3,7 +3,6 @@ from flask import render_template
from flask import current_app
from capsulflask.db import get_model
from capsulflask.shared import *
bp = Blueprint("landing", __name__, url_prefix="/")
@ -13,17 +12,15 @@ def index():
@bp.route("/pricing")
def pricing():
vm_sizes = get_model().vm_sizes_dict()
operating_systems = get_model().operating_systems_dict()
return render_template(
"pricing.html",
vm_sizes=vm_sizes,
operating_systems=operating_systems
)
@bp.route("/faq")
def faq():
return render_template("faq.html", ssh_username=current_app.config['SSH_USERNAME'])
return render_template("faq.html")
@bp.route("/about-ssh")
def about_ssh():

View File

@ -15,7 +15,6 @@ from werkzeug.exceptions import abort
from capsulflask.db import get_model
from capsulflask.auth import account_required
from capsulflask.shared import *
mutex = Lock()
bp = Blueprint("metrics", __name__, url_prefix="/metrics")
@ -144,7 +143,7 @@ def get_plot_bytes(metric, capsulid, duration, size):
def draw_plot_png_bytes(data, scale, size_x=3, size_y=1):
#mylog_info(current_app, json.dumps(data, indent=4, default=str))
#current_app.logger.info(json.dumps(data, indent=4, default=str))
pyplot.style.use("seaborn-dark")
fig, my_plot = pyplot.subplots(figsize=(size_x, size_y))

View File

@ -21,7 +21,7 @@ from werkzeug.exceptions import abort
from capsulflask.auth import account_required
from capsulflask.db import get_model
from capsulflask.shared import *
from capsulflask.shared import my_exec_info_message
bp = Blueprint("payment", __name__, url_prefix="/payment")
@ -48,10 +48,6 @@ def validate_dollars():
def btcpay_payment():
errors = list()
if not current_app.config['BTCPAY_ENABLED']:
flash("BTCPay is not enabled on this server")
return redirect(url_for("console.account_balance"))
if request.method == "POST":
result = validate_dollars()
errors = result[0]
@ -67,11 +63,11 @@ def btcpay_payment():
notificationURL=f"{current_app.config['BASE_URL']}/payment/btcpay/webhook"
))
mylog_info(current_app, f"created btcpay invoice: {invoice}")
current_app.logger.info(f"created btcpay invoice: {invoice}")
# print(invoice)
mylog_info(current_app, f"created btcpay invoice_id={invoice['id']} ( {session['account']}, ${dollars} )")
current_app.logger.info(f"created btcpay invoice_id={invoice['id']} ( {session['account']}, ${dollars} )")
get_model().create_payment_session("btcpay", invoice["id"], session["account"], dollars)
@ -89,7 +85,7 @@ def poll_btcpay_session(invoice_id):
try:
invoice = current_app.config['BTCPAY_CLIENT'].get_invoice(invoice_id)
except:
mylog_error(current_app, f"""
current_app.logger.error(f"""
error was thrown when contacting btcpay server:
{my_exec_info_message(sys.exc_info())}"""
)
@ -101,13 +97,13 @@ def poll_btcpay_session(invoice_id):
dollars = invoice['price']
mylog_info(current_app, f"poll_btcpay_session invoice_id={invoice_id}, status={invoice['status']} dollars={dollars}")
current_app.logger.info(f"poll_btcpay_session invoice_id={invoice_id}, status={invoice['status']} dollars={dollars}")
if invoice['status'] == "paid" or invoice['status'] == "confirmed" or invoice['status'] == "complete":
success_account = get_model().consume_payment_session("btcpay", invoice_id, dollars)
if success_account:
mylog_info(current_app, f"{success_account} paid ${dollars} successfully (btcpay_invoice_id={invoice_id})")
current_app.logger.info(f"{success_account} paid ${dollars} successfully (btcpay_invoice_id={invoice_id})")
if invoice['status'] == "complete":
get_model().btcpay_invoice_resolved(invoice_id, True)
@ -120,7 +116,7 @@ def poll_btcpay_session(invoice_id):
@bp.route("/btcpay/webhook", methods=("POST",))
def btcpay_webhook():
mylog_info(current_app, f"got btcpay webhook")
current_app.logger.info(f"got btcpay webhook")
# IMPORTANT! there is no signature or credential to authenticate the data sent into this webhook :facepalm:
# its just a notification, thats all.
@ -148,7 +144,7 @@ def stripe_payment():
if len(errors) == 0:
mylog_info(current_app, f"creating stripe checkout session for {session['account']}, ${dollars}")
current_app.logger.info(f"creating stripe checkout session for {session['account']}, ${dollars}")
checkout_session = stripe.checkout.Session.create(
success_url=current_app.config['BASE_URL'] + "/payment/stripe/success?session_id={CHECKOUT_SESSION_ID}",
@ -167,7 +163,7 @@ def stripe_payment():
)
stripe_checkout_session_id = checkout_session['id']
mylog_info(current_app, f"stripe_checkout_session_id={stripe_checkout_session_id} ( {session['account']}, ${dollars} )")
current_app.logger.info(f"stripe_checkout_session_id={stripe_checkout_session_id} ( {session['account']}, ${dollars} )")
get_model().create_payment_session("stripe", stripe_checkout_session_id, session["account"], dollars)
@ -251,13 +247,13 @@ def validate_stripe_checkout_session(stripe_checkout_session_id):
def success():
stripe_checkout_session_id = request.args.get('session_id')
if not stripe_checkout_session_id:
mylog_info(current_app, "/payment/stripe/success returned 400: missing required URL parameter session_id")
current_app.logger.info("/payment/stripe/success returned 400: missing required URL parameter session_id")
abort(400, "missing required URL parameter session_id")
else:
for _ in range(0, 5):
paid = validate_stripe_checkout_session(stripe_checkout_session_id)
if paid:
mylog_info(current_app, f"{paid['email']} paid ${paid['dollars']} successfully (stripe_checkout_session_id={stripe_checkout_session_id})")
current_app.logger.info(f"{paid['email']} paid ${paid['dollars']} successfully (stripe_checkout_session_id={stripe_checkout_session_id})")
return redirect(url_for("console.account_balance"))
else:
sleep(1)
@ -293,4 +289,4 @@ def success():
# except stripe.error.SignatureVerificationError:
# print("/payment/stripe/webhook returned 400: invalid signature")
# abort(400, "invalid signature")

capsulflask/publicapi.py Normal file
View File

@ -0,0 +1,40 @@
import datetime
from flask import Blueprint
from flask import current_app
from flask import jsonify
from flask import request
from flask import session
from nanoid import generate
from capsulflask.auth import account_required
from capsulflask.db import get_model
bp = Blueprint("publicapi", __name__, url_prefix="/api")
@bp.route("/capsul/create", methods=["POST"])
@account_required
def capsul_create():
email = session["account"]
from .console import _create,get_account_balance, get_payments, get_vms
vm_sizes = get_model().vm_sizes_dict()
operating_systems = get_model().operating_systems_dict()
public_keys_for_account = get_model().list_ssh_public_keys_for_account(session["account"])
account_balance = get_account_balance(get_vms(), get_payments(), datetime.datetime.utcnow())
capacity_avaliable = current_app.config["HUB_MODEL"].capacity_avaliable(512*1024*1024)
request.json['ssh_authorized_key_count'] = 1
id, errors = _create(
vm_sizes,
operating_systems,
public_keys_for_account,
request.json)
if id is not None:
return jsonify(
id=id,
)
return jsonify(errors=errors)

View File

@ -0,0 +1,2 @@
DROP TABLE api_keys;
UPDATE schemaversion SET version = 18;

View File

@ -1,8 +0,0 @@
DELETE FROM os_images WHERE id = 'guixsystem130';
DELETE FROM os_images WHERE id = 'archlinux';
UPDATE os_images SET deprecated = FALSE WHERE id = 'guixsystem120';
UPDATE os_images SET deprecated = FALSE WHERE id = 'centos7';
UPDATE os_images SET deprecated = FALSE WHERE id = 'centos8';
UPDATE os_images SET description = 'Ubuntu 20.04 LTS (Fossa)' WHERE id = 'ubuntu20';
UPDATE schemaversion SET version = 18;

View File

@ -0,0 +1,9 @@
CREATE TABLE api_tokens (
id SERIAL PRIMARY KEY,
email TEXT REFERENCES accounts(email) ON DELETE RESTRICT,
name TEXT NOT NULL,
created TIMESTAMP NOT NULL DEFAULT NOW(),
token TEXT NOT NULL
);
UPDATE schemaversion SET version = 19;

View File

@ -1,12 +0,0 @@
INSERT INTO os_images (id, template_image_file_name, description, deprecated)
VALUES ('guixsystem130', 'guixsystem/1.3.0/root.img.qcow2', 'Guix System 1.3.0', FALSE);
INSERT INTO os_images (id, template_image_file_name, description, deprecated)
VALUES ('archlinux', 'archlinux/root.img.qcow2', 'Arch Linux', FALSE);
UPDATE os_images SET deprecated = TRUE WHERE id = 'guixsystem120';
UPDATE os_images SET deprecated = TRUE WHERE id = 'centos7';
UPDATE os_images SET deprecated = TRUE WHERE id = 'centos8';
UPDATE os_images SET description = 'Ubuntu 20.04 (Focal)' WHERE id = 'ubuntu20';
UPDATE schemaversion SET version = 19;

View File

@ -1,8 +1,7 @@
import re
from flask import current_app, Flask
from flask import current_app
from typing import List
from threading import Lock
class OnlineHost:
def __init__(self, id: str, url: str):
@ -55,43 +54,4 @@ def authorized_as_hub(headers):
return False
def my_exec_info_message(exec_info):
return "{}: {}".format(".".join([exec_info[0].__module__, exec_info[0].__name__]), exec_info[1])
mylog_current_test_id_container = {
'value': '',
'mutex': Lock()
}
def set_mylog_test_id(test_id):
mylog_current_test_id_container['value'] = ".".join(test_id.split(".")[-2:])
def log_output_for_tests(app: Flask, message: str):
if app.config['TESTING'] and mylog_current_test_id_container['value'] != "":
mylog_current_test_id_container['mutex'].acquire()
file_object = open('unittest-log-output.log', 'a')
file_object.write(f"{mylog_current_test_id_container['value']}: {message}\n")
file_object.close()
mylog_current_test_id_container['mutex'].release()
def mylog_debug(app: Flask, message: str):
log_output_for_tests(app, f"DEBUG: {message}")
app.logger.debug(message)
def mylog_info(app: Flask, message: str):
log_output_for_tests(app, f"INFO: {message}")
app.logger.info(message)
def mylog_warning(app: Flask, message: str):
log_output_for_tests(app, f"WARNING: {message}")
app.logger.warning(message)
def mylog_error(app: Flask, message: str):
log_output_for_tests(app, f"ERROR: {message}")
app.logger.error(message)
def mylog_critical(app: Flask, message: str):
log_output_for_tests(app, f"CRITICAL: {message}")
app.logger.critical(message)
return "{}: {}".format(".".join([exec_info[0].__module__, exec_info[0].__name__]), exec_info[1])

View File

@ -30,6 +30,6 @@ if virsh domuuid "$vmname" | grep -vqE '^[\t\s\n]*$'; then
fi
# this gets the ipv4
ipv4="$(virsh domifaddr "$vmname" | awk '/ipv4/ {print $4}' | cut -d'/' -f1)"
ipv4="$(virsh domifaddr "$vmname" | awk '/vnet/ {print $4}' | cut -d'/' -f1)"
echo "$exists $state $ipv4"

View File

@ -8,7 +8,7 @@ from flask import request
from flask.json import jsonify
from werkzeug.exceptions import abort
from capsulflask.shared import *
from capsulflask.shared import my_exec_info_message, authorized_as_hub
bp = Blueprint("spoke", __name__, url_prefix="/spoke")
@ -19,18 +19,18 @@ def heartbeat():
authorization_header = f"Bearer {current_app.config['SPOKE_HOST_TOKEN']}"
result = current_app.config['HTTP_CLIENT'].do_http_sync(url, body=None, authorization_header=authorization_header)
if result.status_code == -1:
mylog_info(current_app, f"/spoke/heartbeat returned 503: hub at {url} timed out or cannot be reached")
current_app.logger.info(f"/spoke/heartbeat returned 503: hub at {url} timed out or cannot be reached")
return abort(503, "Service Unavailable: hub timed out or cannot be reached")
if result.status_code == 401:
mylog_info(current_app, f"/spoke/heartbeat returned 502: hub at {url} rejected our token")
current_app.logger.info(f"/spoke/heartbeat returned 502: hub at {url} rejected our token")
return abort(502, "hub rejected our token")
if result.status_code != 200:
mylog_info(current_app, f"/spoke/heartbeat returned 502: hub at {url} returned {result.status_code}")
current_app.logger.info(f"/spoke/heartbeat returned 502: hub at {url} returned {result.status_code}")
return abort(502, "Bad Gateway: hub did not return 200")
return "OK"
else:
mylog_info(current_app, f"/spoke/heartbeat returned 401: invalid hub token")
current_app.logger.info(f"/spoke/heartbeat returned 401: invalid hub token")
return abort(401, "invalid hub token")
@bp.route("/operation/<int:operation_id>", methods=("POST",))
@ -43,10 +43,9 @@ def operation_without_id():
def operation_impl(operation_id: int):
if authorized_as_hub(request.headers):
request_body = request.json
if not isinstance(request.json, dict) and not isinstance(request.json, list):
request_body = json.loads(request.json)
request_body_json = request.json
request_body = json.loads(request_body_json)
#current_app.logger.info(f"request.json: {request_body}")
handlers = {
"capacity_avaliable": handle_capacity_avaliable,
"get": handle_get,
@ -64,7 +63,7 @@ def operation_impl(operation_id: int):
return handlers[request_body['type']](operation_id, request_body)
except:
error_message = my_exec_info_message(sys.exc_info())
mylog_error(current_app, f"unhandled exception in {request_body['type']} handler: {error_message}")
current_app.logger.error(f"unhandled exception in {request_body['type']} handler: {error_message}")
return jsonify(dict(error_message=error_message))
else:
error_message = f"'type' must be one of {types_csv}"
@ -72,15 +71,15 @@ def operation_impl(operation_id: int):
error_message = "'type' json property is required"
if error_message != "":
mylog_error(current_app, f"/hosts/operation returned 400: {error_message}")
current_app.logger.error(f"/hosts/operation returned 400: {error_message}")
return abort(400, f"bad request; {error_message}")
else:
mylog_warning(current_app, f"/hosts/operation returned 401: invalid hub token")
current_app.logger.warning(f"/hosts/operation returned 401: invalid hub token")
return abort(401, "invalid hub token")
def handle_capacity_avaliable(operation_id, request_body):
if 'additional_ram_bytes' not in request_body:
mylog_error(current_app, f"/hosts/operation returned 400: additional_ram_bytes is required for capacity_avaliable")
current_app.logger.error(f"/hosts/operation returned 400: additional_ram_bytes is required for capacity_avaliable")
return abort(400, f"bad request; additional_ram_bytes is required for capacity_avaliable")
has_capacity = current_app.config['SPOKE_MODEL'].capacity_avaliable(request_body['additional_ram_bytes'])
@ -88,7 +87,7 @@ def handle_capacity_avaliable(operation_id, request_body):
def handle_get(operation_id, request_body):
if 'id' not in request_body:
mylog_error(current_app, f"/hosts/operation returned 400: id is required for get")
current_app.logger.error(f"/hosts/operation returned 400: id is required for get")
return abort(400, f"bad request; id is required for get")
vm = current_app.config['SPOKE_MODEL'].get(request_body['id'], request_body['get_ssh_host_keys'])
@ -102,7 +101,7 @@ def handle_list_ids(operation_id, request_body):
def handle_create(operation_id, request_body):
if not operation_id:
mylog_error(current_app, f"/hosts/operation returned 400: operation_id is required for create ")
current_app.logger.error(f"/hosts/operation returned 400: operation_id is required for create ")
return abort(400, f"bad request; operation_id is required. try POST /spoke/operation/<id>")
parameters = ["email", "id", "os", "size", "template_image_file_name", "vcpus", "memory_mb", "ssh_authorized_keys"]
@ -112,7 +111,7 @@ def handle_create(operation_id, request_body):
error_message = f"{error_message}\n{parameter} is required for create"
if error_message != "":
mylog_error(current_app, f"/hosts/operation returned 400: {error_message}")
current_app.logger.error(f"/hosts/operation returned 400: {error_message}")
return abort(400, f"bad request; {error_message}")
# only one host should create the vm, so we first race to assign this create operation to ourselves.
@ -141,7 +140,7 @@ def handle_create(operation_id, request_body):
elif result.status_code == 409:
assignment_status = "assigned_to_other_host"
else:
mylog_error(current_app, f"{url} returned {result.status_code}: {result.body}")
current_app.logger.error(f"{url} returned {result.status_code}: {result.body}")
return abort(503, f"hub did not cleanly handle our request to claim the create operation")
if assignment_status == "assigned":
@ -167,7 +166,7 @@ def handle_create(operation_id, request_body):
params= f"{params} network_name='{request_body['network_name'] if 'network_name' in request_body else 'KeyError'}', "
params= f"{params} public_ipv4='{request_body['public_ipv4'] if 'public_ipv4' in request_body else 'KeyError'}', "
mylog_error(current_app, f"spoke_model.create({params}) failed: {error_message}")
current_app.logger.error(f"spoke_model.create({params}) failed: {error_message}")
return jsonify(dict(assignment_status=assignment_status, error_message=error_message))
@ -175,11 +174,11 @@ def handle_create(operation_id, request_body):
def handle_destroy(operation_id, request_body):
if 'id' not in request_body:
mylog_error(current_app, f"/hosts/operation returned 400: id is required for destroy")
current_app.logger.error(f"/hosts/operation returned 400: id is required for destroy")
return abort(400, f"bad request; id is required for destroy")
if 'email' not in request_body:
mylog_error(current_app, f"/hosts/operation returned 400: email is required for destroy")
current_app.logger.error(f"/hosts/operation returned 400: email is required for destroy")
return abort(400, f"bad request; email is required for destroy")
try:
@ -188,7 +187,7 @@ def handle_destroy(operation_id, request_body):
error_message = my_exec_info_message(sys.exc_info())
params = f"email='{request_body['email'] if 'email' in request_body else 'KeyError'}', "
params= f"{params} id='{request_body['id'] if 'id' in request_body else 'KeyError'}', "
mylog_error(current_app, f"current_app.config['SPOKE_MODEL'].destroy({params}) failed: {error_message}")
current_app.logger.error(f"current_app.config['SPOKE_MODEL'].destroy({params}) failed: {error_message}")
return jsonify(dict(assignment_status="assigned", status="error", error_message=error_message))
return jsonify(dict(assignment_status="assigned", status="success"))
@ -199,11 +198,11 @@ def handle_vm_state_command(operation_id, request_body):
required_properties = ['id', 'email', 'command']
for required_property in required_properties:
if required_property not in request_body:
mylog_error(current_app, f"/hosts/operation returned 400: {required_property} is required for vm_state_command")
current_app.logger.error(f"/hosts/operation returned 400: {required_property} is required for vm_state_command")
return abort(400, f"bad request; {required_property} is required for vm_state_command")
if request_body['command'] not in ["stop", "force-stop", "start", "restart"]:
mylog_error(current_app, f"/hosts/operation returned 400: command ({request_body['command']}) must be one of stop, force-stop, start, or restart")
current_app.logger.error(f"/hosts/operation returned 400: command ({request_body['command']}) must be one of stop, force-stop, start, or restart")
return abort(400, f"bad request; command ({request_body['command']}) must be one of stop, force-stop, start, or restart")
try:
@ -213,7 +212,7 @@ def handle_vm_state_command(operation_id, request_body):
params = f"email='{request_body['email'] if 'email' in request_body else 'KeyError'}', "
params= f"{params} id='{request_body['id'] if 'id' in request_body else 'KeyError'}', "
params= f"{params} command='{request_body['command'] if 'command' in request_body else 'KeyError'}', "
mylog_error(current_app, f"current_app.config['SPOKE_MODEL'].vm_state_command({params}) failed: {error_message}")
current_app.logger.error(f"current_app.config['SPOKE_MODEL'].vm_state_command({params}) failed: {error_message}")
return jsonify(dict(assignment_status="assigned", status="error", error_message=error_message))
return jsonify(dict(assignment_status="assigned", status="success"))

View File

@ -10,7 +10,7 @@ from subprocess import run
from capsulflask.db import get_model
from capsulflask.shared import *
from capsulflask.shared import VirtualizationInterface, VirtualMachine, validate_capsul_id, my_exec_info_message
class MockSpoke(VirtualizationInterface):
@ -43,15 +43,15 @@ class MockSpoke(VirtualizationInterface):
def create(self, email: str, id: str, template_image_file_name: str, vcpus: int, memory_mb: int, ssh_authorized_keys: list, network_name: str, public_ipv4: str):
validate_capsul_id(id)
mylog_info(current_app, f"mock create: {id} for {email}")
current_app.logger.info(f"mock create: {id} for {email}")
self.capsuls[id] = dict(email=email, id=id, network_name=network_name, public_ipv4=public_ipv4)
sleep(1)
def destroy(self, email: str, id: str):
mylog_info(current_app, f"mock destroy: {id} for {email}")
current_app.logger.info(f"mock destroy: {id} for {email}")
def vm_state_command(self, email: str, id: str, command: str):
mylog_info(current_app, f"mock {command}: {id} for {email}")
current_app.logger.info(f"mock {command}: {id} for {email}")
class ShellScriptSpoke(VirtualizationInterface):
@ -73,7 +73,7 @@ class ShellScriptSpoke(VirtualizationInterface):
completedProcess = run(my_args, capture_output=True)
if completedProcess.returncode != 0:
mylog_error(current_app, f"""
current_app.logger.error(f"""
capacity-avaliable.sh exited {completedProcess.returncode} with
stdout:
{completedProcess.stdout}
@ -85,7 +85,7 @@ class ShellScriptSpoke(VirtualizationInterface):
lines = completedProcess.stdout.splitlines()
output = lines[len(lines)-1]
if not output == b"yes":
mylog_error(current_app, f"capacity-avaliable.sh exited 0 and returned {output} but did not return \"yes\" ")
current_app.logger.error(f"capacity-avaliable.sh exited 0 and returned {output} but did not return \"yes\" ")
return False
return True
@ -96,14 +96,14 @@ class ShellScriptSpoke(VirtualizationInterface):
self.validate_completed_process(completedProcess)
lines = completedProcess.stdout.splitlines()
if len(lines) == 0:
mylog_warning(current_app, "shell_scripts/get.sh returned zero lines!")
current_app.logger.warning("shell_scripts/get.sh returned zero lines!")
return None
result_string = lines[0].decode("utf-8")
fields = result_string.split(" ")
if fields[0] != "true":
mylog_warning(current_app, f"shell_scripts/get.sh was called for {id} which libvirt says does not exist.")
current_app.logger.warning(f"shell_scripts/get.sh was called for {id} which libvirt says does not exist.")
return None
if len(fields) < 2:
@ -126,7 +126,7 @@ class ShellScriptSpoke(VirtualizationInterface):
ssh_host_keys = json.loads(completedProcess2.stdout.decode("utf-8"))
return VirtualMachine(id, current_app.config["SPOKE_HOST_ID"], state=state, ipv4=ipaddr, ssh_host_keys=ssh_host_keys)
except:
mylog_warning(current_app, f"""
current_app.logger.warning(f"""
failed to ssh-keyscan {id} at {ipaddr}:
{my_exec_info_message(sys.exc_info())}"""
)
@ -218,5 +218,5 @@ class ShellScriptSpoke(VirtualizationInterface):
completedProcess = run([join(current_app.root_path, f"shell_scripts/{command}.sh"), id], capture_output=True)
self.validate_completed_process(completedProcess, email)
returned_string = completedProcess.stdout.decode("utf-8")
mylog_info(current_app, f"{command} vm {id} for {email} returned: {returned_string}")
current_app.logger.info(f"{command} vm {id} for {email} returned: {returned_string}")

Binary file not shown. (removed; 1.7 KiB)

Binary file not shown. (removed; 1.2 KiB)

View File

@ -241,6 +241,7 @@ thead {
background: #bdc7b812;
}
td, th {
padding: 0.1em 1em;
}
table.small td, table.small th {
@ -377,4 +378,4 @@ footer {
border: 1px solid rgba(255, 223, 155, 0.8);
box-sizing: border-box;
position: relative;
}
}

View File

@ -1,35 +0,0 @@
html {
color: #241e1e !important;
background-color: #bdc7b8 !important;
}
a {
color:#00517a !important;
text-shadow: 1px 1px 0px #eee !important;
}
a:hover, a:active, a:visited {
color: #323417 !important;
}
.nav-links a {
text-shadow: 2px 2px 0px #eee !important;
}
h1, h2, h3, h4, h5 {
text-shadow: 2px 2px 0px #eee;
}
main {
border: 1px dashed #241e1e !important;
}
input, select, textarea {
color: #241e1e !important;
}
th {
border-right: 1px solid #eee !important;
text-align: left !important;
}

View File

@ -46,9 +46,7 @@
<a href="/payment/stripe">Add funds with Credit/Debit (stripe)</a>
<ul><li>notice: stripe will load nonfree javascript </li></ul>
</li>
{% if btcpay_enabled %}
<li><a href="/payment/btcpay">Add funds with Bitcoin/Litecoin/Monero (btcpay)</a></li>
{% endif %}
<li>Cash: email <a href="mailto:treasurer@cyberia.club">treasurer@cyberia.club</a></li>
</ul>

View File

@ -31,7 +31,7 @@
{% if session["account"] %}
<a href="/console">Capsuls</a>
<a href="/console/ssh">SSH Public Keys</a>
<a href="/console/keys">SSH &amp; API Keys</a>
<a href="/console/account-balance">Account Balance</a>
{% endif %}

View File

@ -97,11 +97,11 @@
</div>
<div class="row justify-start">
<label class="align" for="ssh_username">SSH Username</label>
<span id="ssh_username">{{ vm['ssh_username'] }}</span>
<span id="ssh_username">cyberian</span>
</div>
<div class="row justify-start">
<label class="align" for="ssh_authorized_keys">SSH Authorized Keys</label>
<a id="ssh_authorized_keys" href="/console/ssh">{{ vm['ssh_authorized_keys'] }}</a>
<a id="ssh_authorized_keys" href="/console/keys">{{ vm['ssh_authorized_keys'] }}</a>
</div>
</div>

View File

@ -31,7 +31,7 @@
<p>(At least one month of funding is required)</p>
{% elif no_ssh_public_keys %}
<p>You don't have any ssh public keys yet.</p>
<p>You must <a href="/console/ssh">upload one</a> before you can create a Capsul.</p>
<p>You must <a href="/console/keys">upload one</a> before you can create a Capsul.</p>
{% elif not capacity_avaliable %}
<p>Host(s) at capacity. No capsuls can be created at this time. sorry. </p>
{% else %}

View File

@ -21,13 +21,12 @@
</li>
<li>
How do I log in?
<p>ssh to the ip provided to you using the "{{ ssh_username }}" user.</p>
<pre class='code'>$ ssh {{ ssh_username }}@1.2.3.4</pre>
<p>For more information, see <a href="/about-ssh">Understanding the Secure Shell Protocol (SSH)</a>.</p>
<p>ssh to the ip provided to you using the cyberian user.</p>
<pre class='code'>$ ssh cyberian@1.2.3.4</pre>
</li>
<li>
How do I change to the root user?
<p>The "{{ ssh_username }}" user has passwordless sudo access by default. This should work:</p>
<p>The cyberian user has passwordless sudo access by default. This should work:</p>
<pre class='code'>
# Linux
$ sudo su -

View File

@ -1,17 +1,18 @@
{% extends 'base.html' %}
{% block title %}SSH Public Keys{% endblock %}
{% block title %}SSH &amp; API Keys{% endblock %}
{% block content %}
<div class="row third-margin">
<h1>SSH PUBLIC KEYS</h1>
</div>
<div class="row third-margin"><div>
{% if has_ssh_public_keys %} <hr/> {% endif %}
{% if ssh_public_keys|length > 0 %} <hr/> {% endif %}
{% for ssh_public_key in ssh_public_keys %}
<form method="post">
<input type="hidden" name="method" value="DELETE"></input>
<input type="hidden" name="action" value="delete_ssh_key"></input>
<input type="hidden" name="name" value="{{ ssh_public_key['name'] }}"></input>
<input type="hidden" name="csrf-token" value="{{ csrf_token }}"/>
<div class="row">
@ -22,13 +23,14 @@
</form>
{% endfor %}
{% if has_ssh_public_keys %} <hr/> {% endif %}
{% if ssh_public_keys|length > 0 %} <hr/> {% endif %}
<div class="third-margin">
<h1>UPLOAD A NEW SSH PUBLIC KEY</h1>
</div>
<form method="post">
<input type="hidden" name="method" value="POST"></input>
<input type="hidden" name="action" value="upload_ssh_key"></input>
<input type="hidden" name="csrf-token" value="{{ csrf_token }}"/>
<div class="row justify-start">
<label class="align" for="content">File Contents</label>
@ -54,6 +56,51 @@
</div>
</form>
</div></div>
<hr/>
<div class="row third-margin">
<h1>API KEYS</h1>
</div>
<div class="row third-margin"><div>
{% if generated_api_token %}
<hr/>
Generated key:
<span class="code">{{ generated_api_token }}</span>
{% endif %}
{% if api_tokens|length >0 %} <hr/>{% endif %}
{% for api_token in api_tokens %}
<form method="post">
<input type="hidden" name="method" value="DELETE"></input>
<input type="hidden" name="action" value="delete_api_token"></input>
<input type="hidden" name="id" value="{{ api_token['id'] }}"></input>
<input type="hidden" name="csrf-token" value="{{ csrf_token }}"/>
<div class="row">
<span class="code">{{ api_token['name'] }}</span>
created {{ api_token['created'].strftime("%b %d %Y") }}
<input type="submit" value="Delete">
</div>
</form>
{% endfor %}
{% if api_tokens|length >0 %} <hr/>{% endif %}
<div class="third-margin">
<h1>GENERATE A NEW API KEY</h1>
</div>
<form method="post">
<input type="hidden" name="method" value="POST"></input>
<input type="hidden" name="action" value="generate_api_token"></input>
<input type="hidden" name="csrf-token" value="{{ csrf_token }}"/>
<div class="smalltext">
<p>Generate a new API key, to integrate with other systems.</p>
</div>
<div class="row justify-start">
<label class="align" for="name">Key Name</label>
<input type="text" id="name" name="name"></input> (defaults to creation time)
</div>
<div class="row justify-end">
<input type="submit" value="Generate">
</div>
</form>
</div></div>
{% endblock %}
{% block pagesource %}/templates/ssh-public-keys.html{% endblock %}

View File

@ -6,37 +6,21 @@
<div class="row third-margin">
<h1>CAPSUL TYPES & PRICING</h1>
</div>
<div class="row half-margin">
<table>
<thead>
<tr>
<th>type</th>
<th>monthly*</th>
<th>cpus</th>
<th>mem</th>
<th>ssd</th>
<th>net</th>
</tr>
</thead>
<tbody>
{% for vm_size_key, vm_size in vm_sizes.items() %}
<tr>
<td>{{ vm_size_key }}</td>
<td>${{ vm_size['dollars_per_month'] }}</td>
<td>{{ vm_size['vcpus'] }}</td>
<td>{{ vm_size['memory_mb'] }}</td>
<td>25G</td>
<td>{{ vm_size['bandwidth_gb_per_month'] }}</td>
</tr>
{% endfor %}
</tbody>
</table>
</div>
<div class="row half-margin">
<pre>
type monthly* cpus mem ssd net*
----- ------- ---- --- --- ---
f1-xs $5.00 1 512M 25G .5TB
f1-s $7.50 1 1024M 25G 1TB
f1-m $12.50 1 2048M 25G 2TB
f1-l $20.00 2 3072M 25G 3TB
f1-x $27.50 3 4096M 25G 4TB
f1-xx $50.00 4 8192M 25G 5TB
* net is calculated as a per-month average
* vms are billed for a minimum of 24 hours upon creation
* all VMs come standard with one public IPv4 address
SUPPORTED OPERATING SYSTEMS:

View File

@ -7,11 +7,18 @@
<h1>SUPPORT</h1>
</div>
<div class="row half-margin">
<a href="mailto:support@cyberia.club?subject=capsul%20support%20request">support@cyberia.club</a>
<a href="mailto:support@cyberia.club?subject=Please%20help!">support@cyberia.club</a>
</div>
{% endblock %}
{% block subcontent %}
<p>
Note: We maintain a searchable archive of all support emails at
<a href="https://lists.cyberia.club/~cyberia/support">https://lists.cyberia.club/~cyberia/support</a>
</p>
<p>
If you do not want your mail to appear in a public archive, email <a href="mailto:capsul@cyberia.club?subject=Please%20help!">capsul@cyberia.club</a> instead.
</p>
<p>
Please describe your problem or feature request, and we will do our best to get back to you promptly. Thank you very much.
</p>

View File

@ -12,6 +12,8 @@ class LoginTests(BaseTestCase):
response = client.get(url_for("auth.login"))
self.assert_200(response)
# FIXME test generated login link
def test_login_magiclink(self):
token, ignoreCaseMatches = get_model().login('test@example.com')

View File

@ -1,15 +1,10 @@
import logging
from copy import deepcopy
from unittest.mock import patch
from flask import url_for
from capsulflask.hub_model import MockHub
from capsulflask.db import get_model
from capsulflask.tests_base import BaseTestCase
from capsulflask.shared import *
from capsulflask.spoke_model import MockSpoke
class ConsoleTests(BaseTestCase):
@ -22,31 +17,10 @@ class ConsoleTests(BaseTestCase):
ssh_key_data = {
"name": "key2",
"method": "POST",
"action": "upload_ssh_key",
"content": "ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDntq1t8Ddsa2q4p+PM7W4CLYYmxakokRRVLlf7AQlsTJFPsgBe9u0zuoOaKDMkBr0dlnuLm4Eub1Mj+BrdqAokto0YDiAnxUKRuYQKuHySKK8bLkisi2k47jGBDikx/jihgiuFTawo1mYsJJepC7PPwZGsoCImJEgq1L+ug0p3Zrj3QkUx4h25MpCSs2yvfgWjDyN8hEC76O42P+4ETezYrzrd1Kj26hdzHRnrxygvIUOtfau+5ydlaz8xQBEPrEY6/+pKDuwtXg1pBL7GmoUxBXVfHQSgq5s9jIJH+G0CR0ZoHMB25Ln4X/bsCQbLOu21+IGYKSDVM5TIMLtkKUkERQMVWvnpOp1LZKir4dC0m7SW74wpA8+2b1IsURIr9ARYGJpCEv1Q1Wz/X3yTf6Mfey7992MjUc9HcgjgU01/+kYomoXHprzolk+22Gjfgo3a4dRIoTY82GO8kkUKiaWHvDkkVURCY5dpteLA05sk3Z9aRMYsNXPLeOOPfzTlDA0="
}
def setUp(self):
super().setUp()
get_model().cursor.execute("DELETE FROM host_operation")
get_model().cursor.execute("DELETE FROM operations")
get_model().cursor.execute("DELETE FROM vm_ssh_host_key")
get_model().cursor.execute("DELETE FROM vm_ssh_authorized_key")
get_model().cursor.execute("DELETE FROM ssh_public_keys")
get_model().cursor.execute("DELETE FROM login_tokens")
get_model().cursor.execute("DELETE FROM vms")
get_model().cursor.execute("DELETE FROM payments")
get_model().cursor.connection.commit()
self._login('test@example.com')
get_model().create_ssh_public_key('test@example.com', 'key', 'foo')
# heartbeat all the spokes so that the hub <--> spoke communication can work as normal.
host_ids = get_model().list_hosts_with_networks(None).keys()
for host_id in host_ids:
get_model().host_heartbeat(host_id)
def test_index(self):
self._login('test@example.com')
with self.client as client:
@ -59,58 +33,35 @@ class ConsoleTests(BaseTestCase):
response = client.get(url_for("console.create"))
self.assert_200(response)
def test_create_fails_credit(self):
with self.client as client:
client.get(url_for("console.create"))
csrf_token = self.get_context_variable('csrf_token')
data = self.capsul_data
data['csrf-token'] = csrf_token
client.post(url_for("console.create"), data=data)
capacity_message = \
'Your account must have enough credit to run an f1-xs for 1 month before you will be allowed to create it'
self.assert_message_flashed(capacity_message, category='message')
self.assertEqual(
len(get_model().list_vms_for_account('test@example.com')),
0
)
def test_create_fails_capacity(self):
with self.client as client:
client.get(url_for("console.create"))
csrf_token = self.get_context_variable('csrf_token')
data = self.capsul_data
data['csrf-token'] = csrf_token
get_model().create_payment_session('fake', 'test', 'test@example.com', 20)
get_model().consume_payment_session('fake', 'test', 20)
with patch.object(MockSpoke, 'capacity_avaliable', return_value=False) as mock_method:
response = client.post(url_for("console.create"), data=data)
# Override MockHub.capacity_avaliable to always return False
with patch.object(MockHub, 'capacity_avaliable', return_value=False) as mock_method:
self.app.config["HUB_MODEL"] = MockHub()
self.assert200(response)
client.post(url_for("console.create"), data=data)
capacity_message = \
'\n host(s) at capacity. no capsuls can be created at this time. sorry. \n '
self.assert_message_flashed(capacity_message, category='message')
mock_method.assert_called()
capacity_message = \
'\n host(s) at capacity. no capsuls can be created at this time. sorry. \n '
self.assert_message_flashed(capacity_message, category='message')
self.assertEqual(
len(get_model().list_vms_for_account('test@example.com')),
0
)
self.assertEqual(
len(get_model().list_vms_for_account('test@example.com')),
0
)
mock_method.assert_called_with(512 * 1024 * 1024)
def test_create_fails_invalid(self):
with self.client as client:
client.get(url_for("console.create"))
csrf_token = self.get_context_variable('csrf_token')
data = deepcopy(self.capsul_data)
data = self.capsul_data
data['csrf-token'] = csrf_token
data['os'] = ''
client.post(url_for("console.create"), data=data)
@ -130,46 +81,36 @@ class ConsoleTests(BaseTestCase):
client.get(url_for("console.create"))
csrf_token = self.get_context_variable('csrf_token')
data = deepcopy(self.capsul_data)
data = self.capsul_data
data['csrf-token'] = csrf_token
get_model().create_payment_session('fake', 'test', 'test@example.com', 20)
get_model().consume_payment_session('fake', 'test', 20)
response = client.post(url_for("console.create"), data=data)
# mylog_info(self.app, get_model().list_all_operations())
self.assertEqual(
len(get_model().list_all_operations()),
1
)
vms = get_model().list_vms_for_account('test@example.com')
self.assertEqual(
len(vms),
1
)
vm_id = vms[0]['id']
self.assertRedirects(
response,
url_for("console.index") + f'?created={vm_id}'
)
# FIXME: mock create doesn't create, see #83
# vms = get_model().list_vms_for_account('test@example.com')
# self.assertEqual(
# len(vms),
# 1
# )
#
# vm_id = vms[0].id
#
# self.assertRedirects(
# response,
# url_for("console.index") + f'?{vm_id}'
# )
def test_keys_loads(self):
self._login('test@example.com')
with self.client as client:
response = client.get(url_for("console.ssh_public_keys"))
response = client.get(url_for("console.ssh_api_keys"))
self.assert_200(response)
keys = self.get_context_variable('ssh_public_keys')
self.assertEqual(keys[0]['name'], 'key')
def test_keys_add_fails_invalid(self):
def test_keys_add_ssh_fails_invalid(self):
self._login('test@example.com')
with self.client as client:
client.get(url_for("console.ssh_public_keys"))
client.get(url_for("console.ssh_api_keys"))
csrf_token = self.get_context_variable('csrf_token')
data = self.ssh_key_data
@ -178,7 +119,7 @@ class ConsoleTests(BaseTestCase):
data_invalid_content = data
data_invalid_content['content'] = 'foo'
client.post(
url_for("console.ssh_public_keys"),
url_for("console.ssh_api_keys"),
data=data_invalid_content
)
@ -189,24 +130,41 @@ class ConsoleTests(BaseTestCase):
data_missing_content = data
data_missing_content['content'] = ''
client.post(url_for("console.ssh_public_keys"), data=data_missing_content)
client.post(url_for("console.ssh_api_keys"), data=data_missing_content)
self.assert_message_flashed(
'Content is required', category='message'
)
def test_keys_add_fails_duplicate(self):
def test_keys_add_ssh_fails_duplicate(self):
self._login('test@example.com')
with self.client as client:
client.get(url_for("console.ssh_public_keys"))
client.get(url_for("console.ssh_api_keys"))
csrf_token = self.get_context_variable('csrf_token')
data = self.ssh_key_data
data['csrf-token'] = csrf_token
data['name'] = 'key'
client.post(url_for("console.ssh_public_keys"), data=data)
client.post(url_for("console.ssh_api_keys"), data=data)
self.assert_message_flashed(
'A key with that name already exists',
category='message'
)
data = self.ssh_key_data
data['csrf-token'] = csrf_token
data['name'] = 'key'
client.post(url_for("console.ssh_api_keys"), data=data)
self.assert_message_flashed(
'A key with that name already exists',
category='message'
)
def setUp(self):
self._login('test@example.com')
get_model().create_ssh_public_key('test@example.com', 'key', 'foo')
def tearDown(self):
get_model().delete_ssh_public_key('test@example.com', 'key')

View File

@ -0,0 +1,45 @@
from base64 import b64encode
import requests
from flask import url_for
from capsulflask.db import get_model
from capsulflask.tests_base import BaseLiveServerTestCase
class PublicAPITests(BaseLiveServerTestCase):
def test_server_is_up_and_running(self):
response = requests.get(self.get_server_url())
self.assertEqual(response.status_code, 200)
def test_capsul_create_succeeds(self):
response = requests.post(
self.get_server_url() +
url_for('publicapi.capsul_create'),
headers={'Authorization': self.token},
json={
'size': 'f1-xs',
'os': 'openbsd68',
'ssh_key_0': 'key'
}
)
self.assertEqual(response.status_code, 200)
# FIXME: mock create doesn't create, see #83
# vms = get_model().list_vms_for_account('test@example.com')
#
# self.assertEqual(
# len(vms),
# 1
# )
def setUp(self):
get_model().create_ssh_public_key('test@example.com', 'key', 'foo')
self.token = b64encode(
get_model().generate_api_token('test@example.com', 'apikey').encode('utf-8')
).decode('utf-8')
def tearDown(self):
get_model().delete_ssh_public_key('test@example.com', 'key')
get_model().delete_api_token('test@example.com', 1)

View File

@ -1,119 +1,35 @@
import logging
import unittest
import os
import sys
import json
import itertools
import time
import threading
import asyncio
import traceback
from urllib.parse import urlparse
from typing import List
from nanoid import generate
from concurrent.futures import ThreadPoolExecutor
from flask_testing import TestCase
from flask import current_app
from flask_testing import TestCase, LiveServerTestCase
from capsulflask import create_app
from capsulflask.db import get_model
from capsulflask.http_client import *
from capsulflask.shared import *
class BaseTestCase(TestCase):
class BaseSharedTestCase(object):
def create_app(self):
# Use default connection paramaters
os.environ['POSTGRES_CONNECTION_PARAMETERS'] = "host=localhost port=5432 user=postgres password=dev dbname=capsulflask_test"
os.environ['TESTING'] = 'True'
os.environ['LOG_LEVEL'] = 'DEBUG'
os.environ['TESTING'] = '1'
os.environ['SPOKE_MODEL'] = 'mock'
os.environ['HUB_MODEL'] = 'capsul-flask'
self1 = self
get_app = lambda: self1.app
self.app = create_app(lambda timeout_seconds: TestHTTPClient(get_app, timeout_seconds))
return self.app
os.environ['HUB_MODEL'] = 'mock'
return create_app()
def setUp(self):
set_mylog_test_id(self.id())
pass
def tearDown(self):
set_mylog_test_id("")
pass
class BaseTestCase(BaseSharedTestCase, TestCase):
def _login(self, user_email):
get_model().login(user_email)
with self.client.session_transaction() as session:
session['account'] = user_email
session['csrf-token'] = generate()
class TestHTTPClient:
def __init__(self, get_app, timeout_seconds = 5):
self.timeout_seconds = timeout_seconds
self.get_app = get_app
self.executor = ThreadPoolExecutor()
def do_multi_http_sync(self, online_hosts: List[OnlineHost], url_suffix: str, body: str, authorization_header=None) -> List[HTTPResult]:
future = run_coroutine(self.do_multi_http(online_hosts=online_hosts, url_suffix=url_suffix, body=body, authorization_header=authorization_header))
fromOtherThread = future.result()
toReturn = []
for individualResult in fromOtherThread:
if individualResult.error != None and individualResult.error != "":
mylog_error(self.get_app(), individualResult.error)
toReturn.append(individualResult.http_result)
return toReturn
def do_http_sync(self, url: str, body: str, method="POST", authorization_header=None) -> HTTPResult:
future = run_coroutine(self.do_http(method=method, url=url, body=body, authorization_header=authorization_header))
fromOtherThread = future.result()
if fromOtherThread.error != None and fromOtherThread.error != "":
mylog_error(self.get_app(), fromOtherThread.error)
return fromOtherThread.http_result
async def do_http(self, url: str, body: str, method="POST", authorization_header=None) -> InterThreadResult:
path = urlparse(url).path
headers = {}
if authorization_header != None:
headers['Authorization'] = authorization_header
if body:
headers['Content-Type'] = "application/json"
#mylog_info(self.get_app(), f"path, data=body, headers=headers: {path}, {body}, {headers}")
do_request = None
if method == "POST":
do_request = lambda: self.get_app().test_client().post(path, data=body, headers=headers)
if method == "GET":
do_request = lambda: self.get_app().test_client().get(path, headers=headers)
response = None
try:
response = await get_event_loop().run_in_executor(self.executor, do_request)
except:
traceback.print_exc()
error_message = my_exec_info_message(sys.exc_info())
response_body = json.dumps({"error_message": f"do_http (HTTP {method} {url}) {error_message}"})
return InterThreadResult(
HTTPResult(-1, response_body),
f"""do_http (HTTP {method} {url}) failed with: {error_message}"""
)
return InterThreadResult(HTTPResult(response.status_code, response.get_data()), None)
async def do_multi_http(self, online_hosts: List[OnlineHost], url_suffix: str, body: str, authorization_header=None) -> List[InterThreadResult]:
tasks = []
# append to tasks in the same order as online_hosts
for host in online_hosts:
tasks.append(
self.do_http(url=f"{host.url}{url_suffix}", body=body, authorization_header=authorization_header)
)
# gather is like Promise.all from javascript, it returns a future which resolves to an array of results
# in the same order as the tasks that we passed in -- which were in the same order as online_hosts
results = await asyncio.gather(*tasks)
return results
class BaseLiveServerTestCase(BaseSharedTestCase, LiveServerTestCase):
pass

View File

@ -1,94 +0,0 @@
{% extends 'base.html' %}
{% block title %}Account Balance{% endblock %}
{% block content %}
<div class="row third-margin">
<h1>Account Balance: ${{ account_balance }}</h1>
</div>
<div class="half-margin">
{% if has_vms and has_payments and warning_text != "" %}
<div class="row">
<pre class="wrap">{{ warning_text }}</pre>
</div>
{% endif %}
<div class="row">
{% if has_payments %}
<div>
<div class="row third-margin">
<h1>Payments</h1>
</div>
<table>
<thead>
<tr>
<th>amount</th>
<th>date</th>
</tr>
</thead>
<tbody>
{% for payment in payments %}
<tr>
<td class="{{ payment['class_name'] }}">${{ payment["dollars"] }}</td>
<td class="{{ payment['class_name'] }}">{{ payment["created"] }}</td>
</tr>
{% endfor %}
</tbody>
</table>
</div>
{% endif %}
<ul>
<li>
<h1>PAYMENT OPTIONS</h1>
<ul>
<li>
<a href="/payment/stripe">Add funds with Credit/Debit (stripe)</a>
<ul><li>notice: stripe will load nonfree javascript </li></ul>
</li>
{% if btcpay_enabled %}
<li><a href="/payment/btcpay">Add funds with Bitcoin/Litecoin/Monero (btcpay)</a></li>
{% endif %}
</ul>
</li>
</ul>
</div>
{% if has_vms %}
<div class="row third-margin">
<h1>Capsuls Billed</h1>
</div>
<div class="row">
<table class="small">
<thead>
<tr>
<th>id</th>
<th>created</th>
<th>deleted</th>
<th>$/month</th>
<th>months</th>
<th>$ billed</th>
</tr>
</thead>
<tbody>
{% for vm in vms_billed %}
<tr>
<td>{{ vm["id"] }}</td>
<td>{{ vm["created"] }}</td>
<td>{{ vm["deleted"] }}</td>
<td>${{ vm["dollars_per_month"] }}</td>
<td>{{ vm["months"] }}</td>
<td>${{ vm["dollars"] }}</td>
</tr>
{% endfor %}
</tbody>
</table>
</div>
{% endif %}
</div>
{% endblock %}
{% block pagesource %}/templates/create-capsul.html{% endblock %}

View File

@ -1,60 +0,0 @@
<html lang="en">
<head>
<!-- Namecoin Address: N2aVL6pHtBp7EtNGb3jpsL2L2NyjBNbiB1 -->
<link href="{{ url_for('static', filename='favicon.yolocolo.ico') }}" rel="icon">
<title>{% block title %}{% endblock %}{% if self.title() %} - {% endif %}Capsul</title>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width,initial-scale=1.0">
<meta name="Description" content="Cyberia Capsul">
{% block head %}{% endblock %}
<link rel="stylesheet" href="{{ url_for('static', filename='style.css') }}">
<link rel="stylesheet" href="{{ url_for('static', filename='style.yolocolo.css') }}">
</head>
<body>
<nav>
<div class="row justify-space-between half-margin">
<div>
🦉 <a href="/"><b>YOLOCOLO</b></a>
</div>
<div>
&nbsp;
{% if session["account"] %}
{ {{ session["account"] }} <a href="{{ url_for('auth.logout') }}">Log Out</a> }
{% else %}
<a href="{{ url_for('auth.login') }}">Login</a>
{% endif %}
</div>
</div>
<div class="row justify-center half-margin wrap nav-links">
<a href="/pricing">Pricing</a>
<a href="/faq">FAQ</a>
{% if session["account"] %}
<a href="/console">Capsuls</a>
<a href="/console/ssh">SSH Public Keys</a>
<a href="/console/account-balance">Account Balance</a>
{% endif %}
<a href="/support">Support</a>
</div>
</nav>
{% for message in get_flashed_messages() %}
<div class="flash">{{ message }}</div>
{% endfor %}
{% block custom_flash %}{% endblock %}
<main>
{% block content %}{% endblock %}
</main>
{% block subcontent %}{% endblock %}
<footer>
This server runs <a
href="https://giit.cyberia.club/~forest/capsul-flask">capsul-flask</a> by
Cyberia Computer Club, available under the <a
href="https://creativecommons.org/licenses/by-sa/4.0/">Attribution-ShareAlike
4.0 International</a> licence.<br/><br/>
<a href="https://git.autonomic.zone/3wordchant/capsul-flask/src/branch/yolocolo/capsulflask{% block pagesource %}{% endblock %}">View page source</a>
</footer>
</body>
</html>

View File

@ -1,68 +0,0 @@
{% extends 'base.html' %}
{% block title %}Capsuls{% endblock %}
{% block custom_flash %}
{% if created %}
<div class="flash green">{{ created }} successfully created!</div>
{% endif %}
{% endblock %}
{% block content %}
<div class="row third-margin">
<h1>Capsuls</h1>
</div>
<div class="third-margin">
{% if has_vms %}
<div class="row third-margin justify-end">
<a href="/console/create">Create Capsul</a>
</div>
<div class="row">
<table>
<thead>
<tr>
<th class="heart-icon"></th>
<th>id</th>
<th>size</th>
<th>cpu</th>
<th>mem</th>
<th>ipv4</th>
<th>os</th>
<th>created</th>
</tr>
</thead>
<tbody>
{% for vm in vms %}
<tr>
{% if vm['state'] == 'starting' or vm['state'] == 'stopping' %}
<td class="capsul-status waiting-pulse"></td>
{% elif vm['state'] == 'crashed' or vm['state'] == 'blocked' or vm['state'] == 'stopped' %}
<td class="capsul-status red"></td>
{% elif vm['state'] == 'unknown' %}
<td class="capsul-status-questionmark">?</td>
{% else %}
<td class="capsul-status green"></td>
{% endif %}
<td><a class="no-shadow" href="/console/{{ vm['id'] }}">{{ vm["id"] }}</a></td>
<td>{{ vm["size"] }}</td>
<td class="metrics"><img src="/metrics/cpu/{{ vm['id'] }}/5m/s"/></td>
<td class="metrics"><img src="/metrics/memory/{{ vm['id'] }}/5m/s"/></td>
<td class="{{ vm['ipv4_status'] }}">{{ vm["ipv4"] }}</td>
<td>{{ vm["os"] }}</td>
<td>{{ vm["created"] }}</td>
</tr>
{% endfor %}
</tbody>
</table>
</div>
{% else %}
<div class="row">
<p>You don't have any Capsuls running. <a href="/console/create">Create one</a> today!</p>
</div>
{% endif %}
</div>
{% endblock %}
{% block pagesource %}/templates/capsuls.html{% endblock %}

View File

@ -1,46 +0,0 @@
{% extends 'base.html' %}
{% block title %}FAQ{% endblock %}
{% block content %}
<div class="row full-margin"><h1>Frequently Asked Questions</h1></div>
{% endblock %}
{% block subcontent %}
<p>
<ul>
<li>
What is this?
<p>
This is a <strong>technical demo</strong> of <a
href="https://giit.cyberia.club/~forest/capsul-flask">Capsul</a>, for the
as-yet-untitled <a href="https://coops.tech">Cotech</a> server hosting
initiative, which you can <a
href="https://community.coops.tech/t/call-for-input-v2-co-op-vps-survey/2802/9">read
about on the Cotech forum</a>.
</p>
</li>
<li>
What do you mean, "technical demo"?
<p>No backups</p>
<p>No service level agreement</p>
<p>"Best effort" support</p>
</li>
<li>
Where can I get this, but, more reliable?
<p>Cyberia, the authors of this platform, run the canonical instance, <a
href="https://capsul.org">Capsul.org</a>, on hardware they own. Please
send them your money! (cash, crypto, or card accepted).</p>
</li>
<li>
How do I use this system?
<p>Please see <a href="https://capsul.org/faq">the official Capsul FAQ
page</a>.</p>
</li>
</ul>
</p>
{% endblock %}
{% block pagesource %}/templates/faq.html{% endblock %}

View File

@ -1,28 +0,0 @@
{% extends 'base.html' %}
{% block content %}
<h1>
<pre>
_ _
_ _ ___ | | ___ ___ ___ | | ___
| | | |/ _ \| |/ _ \ / __/ _ \| |/ _ \
| |_| | (_) | | (_) | (_| (_) | | (_) |
\__, |\___/|_|\___/ \___\___/|_|\___/
|___/
</pre>
<span>Co-operative hosting using <a href="https://cyberia.club">Cyberia</a>'s Capsul</span>
{% endblock %}
{% block subcontent %}
<p>
<ul>
<li>Sign up for an account!</li>
<li>Add some funds!</li>
<li>Create a VPS!</li>
<li>Give your feedback!</li>
</ul>
</p>
{% endblock %}
{% block pagesource %}/templates/index.html{% endblock %}

View File

@ -1,23 +0,0 @@
{% extends 'base.html' %}
{% block title %}Pricing{% endblock %}
{% block content %}
<div class="row third-margin">
<h1>CAPSUL TYPES & PRICING</h1>
</div>
<div class="row half-margin">
<p>
Rates for this service aren't set yet. You can see Cyberia's Capsul pricing
on <a href="https://capsul.org/pricing">their website</a>.
</p>
</div>
<div>
<pre>
SUPPORTED OPERATING SYSTEMS:
{% for os_id, os in operating_systems.items() %} - {{ os.description }}
{% endfor %}
</pre>
</div>
{% endblock %}

View File

@ -1,21 +0,0 @@
{% extends 'base.html' %}
{% block title %}Support{% endblock %}
{% block content %}
<div class="row half-margin">
<h1>SUPPORT</h1>
</div>
<div class="row half-margin">
<a href="mailto:yolocolo@doesthisthing.work?subject=Please%20help!">yolocolo@doesthisthing.work</a>
</div>
{% endblock %}
{% block subcontent %}
<p>
You can also find us on Matrix: <a
href="https://matrix.to/#/#untitled-hosting.public:autonomic.zone">#untitled-hosting.public:autonomic.zone</a>.
</p>
{% endblock %}
{% block pagesource %}/templates/support.html{% endblock %}

View File

@ -1,36 +0,0 @@
---
version: "3.8"
services:
app:
image: 3wordchant/capsul-flask:latest
build: .
volumes:
- "./:/app/code"
- "../tank:/tank"
# - "/var/run/libvirt/libvirt-sock:/var/run/libvirt/libvirt-sock"
depends_on:
- db
ports:
- "5000:5000"
environment:
- "POSTGRES_CONNECTION_PARAMETERS=host=db port=5432 user=capsul password=capsul dbname=capsul"
- SPOKE_MODEL
- FLASK_DEBUG
- BASE_URL=http://localhost:5000
- ADMIN_PANEL_ALLOW_EMAIL_ADDRESSES=3wc.capsul@doesthisthing.work
- VIRSH_DEFAULT_CONNECT_URI=qemu:///system
# The image uses gunicorn by default, let's override it with Flask's
# built-in development server
command: ["flask", "run", "-h", "0.0.0.0", "-p", "5000"]
db:
image: "postgres:9.6.5-alpine"
volumes:
- "postgres:/var/lib/postgresql/data"
environment:
POSTGRES_USER: capsul
POSTGRES_PASSWORD: capsul
POSTGRES_DB: capsul
volumes:
postgres:

View File

@ -1,30 +0,0 @@
# hub-and-spoke architecture
The "Hub" runs the web application and talks to the Postrges database, while the "Spoke"s are responsible for creating/managing virtual machines. One instance of the capsul-flask application can run in hub mode and spoke mode at the same time.
The Hub and the Spoke must be configured to communicate securely with each other over HTTPS. They both have to be able to dial each other directly. The URLs / auth tokens they use are configured both in the config file (`HUB_URL`, `SPOKE_HOST_ID`, `SPOKE_HOST_TOKEN` and `HUB_TOKEN`) and in the database (the `id`, `https_url`, and `token` columns in the `hosts` table).
![](images/hub-and-spoke1.png)
This diagram was created with https://app.diagrams.net/.
To edit it, download the <a download href="readme/hub-and-spoke.xml">diagram file</a> and edit it with the https://app.diagrams.net/ web application, or you may run the application from [source](https://github.com/jgraph/drawio) if you wish.
right now I have 2 types of operations: immediate mode and async.
both types of operations do assignment synchronously, so if the system can't assign the operation to one or more hosts (spokes), or whatever the operation requires, then it will fail.
some operations tolerate partial failures; for example, `capacity_avaliable` will succeed if at least one spoke succeeds.
for immediate mode requests (like `list`, `capacity_avaliable`, `destroy`), assignment and completion of the operation are the same thing.
for async ones, they can be assigned without knowing whether or not they succeeded (`create`).
![](images/hub-and-spoke2.png)
This diagram was created with https://app.diagrams.net/.
To edit it, download the <a download href="readme/hub-and-spoke.xml">diagram file</a> and edit it with the https://app.diagrams.net/ web application, or you may run the application from [source](https://github.com/jgraph/drawio) if you wish.
if you issue a create, and it technically could go to any number of hosts, but only one host responds, it will succeed.
but if you issue a create and somehow 2 hosts both think they own that task, it will fail and throw a big error, because it expects exactly 1 host to own the create task.
currently it's not set up to do any polling. it's not really like a queue at all. It's all immediate for the most part.
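As a rough illustration of that "exactly one owner" rule (a minimal sketch, not the actual capsul-flask code; the `assignment_status` values match the spoke API shown earlier in this diff):

```python
def check_create_assignment(host_responses: list) -> dict:
    """Illustrative only: a create operation must be claimed by exactly one spoke."""
    assigned = [r for r in host_responses if r.get("assignment_status") == "assigned"]
    if len(assigned) == 1:
        return assigned[0]  # exactly one owner: the create can proceed
    if not assigned:
        raise RuntimeError("no spoke claimed the create operation")
    raise RuntimeError(f"{len(assigned)} spokes claimed the create operation; expected exactly 1")
```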

View File

@ -1,68 +0,0 @@
## <a name="BTCPAY_PRIVATE_KEY"></a>Setting up the BTCPAY_PRIVATE_KEY
Generate a private key and the accompanying bitpay SIN for the btcpay API client.
I used this code as an example: https://github.com/bitpay/bitpay-python/blob/master/bitpay/key_utils.py#L6
```
$ pipenv run python ./readme/generate_btcpay_keys.py
```
It should output something looking like this:
```
-----BEGIN EC PRIVATE KEY-----
EXAMPLEIArx/EXAMPLEKH23EXAMPLEsYXEXAMPLE5qdEXAMPLEcFHoAcEXAMPLEK
oUQDQgAEnWs47PT8+ihhzyvXX6/yYMAWWODluRTR2Ix6ZY7Z+MV7v0W1maJzqeqq
NQ+cpBvPDbyrDk9+Uf/sEaRCma094g==
-----END EC PRIVATE KEY-----
EXAMPLEwzAEXAMPLEEXAMPLEURD7EXAMPLE
```
In order to register the key with the btcpay server, you have to first generate a pairing token using the btcpay server interface.
This requires your btcpay server account to have access to the capsul store. Ask Cass about this.
Navigate to `Manage store: Access Tokens` at: `https://btcpay.cyberia.club/stores/<store-id>/Tokens`
![](images/btcpay_sin_pairing.jpg)
Finally, send an http request to the btcpay server to complete the pairing:
```
curl -H "Content-Type: application/json" https://btcpay.cyberia.club/tokens -d "{'id': 'EXAMPLEwzAEXAMPLEEXAMPLEURD7EXAMPLE', 'pairingCode': 'XXXXXXX'}"
```
It should respond with a token:
```
{"data":[{"policies":[],"pairingCode":"XXXXXXX","pairingExpiration":1589473817597,"dateCreated":1589472917597,"facade":"merchant","token":"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx","label":"capsulflask"}]}
```
And you should see the token in the btcpay server UI:
![](images/paired.jpg)
Now simply set your `BTCPAY_PRIVATE_KEY` variable in `.env`
NOTE: make sure to use single quotes and replace the newlines with `\n`.
```
BTCPAY_PRIVATE_KEY='-----BEGIN EC PRIVATE KEY-----\nEXAMPLEIArx/EXAMPLEKH23EXAMPLEsYXEXAMPLE5qdEXAMPLEcFHoAcEXAMPLEK\noUQDQgAEnWs47PT8+ihhzyvXX6/yYMAWWODluRTR2Ix6ZY7Z+MV7v0W1maJzqeqq\nNQ+cpBvPDbyrDk9+Uf/sEaRCma094g==\n-----END EC PRIVATE KEY-----'
```
-----
## <a name="testing"></a>testing cryptocurrency payments
I used litecoin to test cryptocurrency payments, because it's the simplest & lowest-fee cryptocurrency that BTCPay server supports. You can download the easy-to-use litecoin SPV wallet `electrum-ltc` from [github.com/pooler/electrum-ltc](https://github.com/pooler/electrum-ltc) or [electrum-ltc.org](https://electrum-ltc.org/), set up a wallet, and then either purchase some litecoin from an exchange, or [ask Forest for some litecoin](https://sequentialread.com/capsul-rollin-onwards-with-a-web-application/#sqr-comment-container) to use for testing.
## <a name="0_conf_diagram"></a>sequence diagram explaining how BTC payment process works (how we accept 0-confirmation transactions 😀)
![btcpayment_process](images/btcpayment_process.png)
This diagram was created with https://app.diagrams.net/.
To edit it, download the <a download href="readme/btcpayment_process.drawio">diagram file</a> and edit it with the https://app.diagrams.net/ web application, or you may run the application from [source](https://github.com/jgraph/drawio) if you wish.

Binary file not shown. (removed; 10 KiB)

View File

@ -1,90 +0,0 @@
# Configuring Capsul-Flask
Create a `.env` file to set up the application configuration:
```
cp .env.sample .env
nano .env
```
You can enter any environment variables referenced in [`__init__.py`](../capsulflask/__init__.py) into this file.
For example you may enter your SMTP credentials like this:
```
MAIL_USERNAME=forest@nullhex.com
MAIL_DEFAULT_SENDER=forest@nullhex.com
MAIL_PASSWORD=**************
```
## <a name="example"></a>Example configuration from capsul.org (production):
```
#LOG_LEVEL=DEBUG
BASE_URL="https://capsul.org"
# hub url is used by the SPOKE_MODE to contact the hub. Since this server is the hub,
# this is fine. In fact it runs into problems (routing related?) when I set it to capsul.org.
# similarly the baikal "spoke" (set up in the hosts table in the db) has "http://localhost:5000" as the https_url
HUB_URL="http://localhost:5000"
HUB_MODE_ENABLED="t"
SPOKE_MODE_ENABLED="t"
HUB_MODEL="capsul-flask"
SPOKE_MODEL="shell-scripts"
SPOKE_HOST_ID="baikal"
SPOKE_HOST_TOKEN="<redacted>"
HUB_TOKEN="<redacted>"
# smtp.. see https://flask-mail.readthedocs.io/en/latest/#configuring-flask-mail
MAIL_SERVER="smtp.nullhex.com"
# MAIL_USE_SSL means SMTP with STARTTLS
MAIL_USE_SSL=true
# MAIL_USE_TLS means SMTP wrapped in TLS
MAIL_USE_TLS=false
MAIL_PORT="465"
MAIL_USERNAME="capsul@nullhex.com"
MAIL_PASSWORD="<redacted>"
MAIL_DEFAULT_SENDER="capsul@nullhex.com"
# stripe
STRIPE_SECRET_KEY="sk_live_<redacted>"
STRIPE_PUBLISHABLE_KEY="pk_live_tGDHY7kBwqC71b4F0N7LZdGl00GZOw0iNJ"
# internal
SECRET_KEY="<redacted>"
POSTGRES_CONNECTION_PARAMETERS="sslmode=verify-full sslrootcert=letsencrypt-root-ca.crt host=postgres.cyberia.club port=5432 ...<redacted>"
# btcpay server
BTCPAY_URL="https://beeteeceepae2.cyberia.club"
BTCPAY_PRIVATE_KEY='-----BEGIN EC PRIVATE KEY-----\n<redacted>\n-----END EC PRIVATE KEY-----'
```
## <a name="config_that_lives_in_db"></a>Configuration-type-stuff that lives in the database
- `hosts` table:
- `id` (corresponds to `SPOKE_HOST_ID` in the config)
- `https_url`
- `token` (corresponds to `SPOKE_HOST_TOKEN` in the config)
- `os_images` table:
- `id`
- `template_image_file_name`
- `description`
- `deprecated`
- `vm_sizes` table:
- `id`
- `dollars_per_month`
- `memory_mb`
- `vcpus`
- `bandwidth_gb_per_month`
## <a name="docker_secrets"></a>Loading variables from files (docker secrets)
To support [Docker Secrets](https://docs.docker.com/engine/swarm/secrets/), you can also load secret values from files. For example, to load `MAIL_PASSWORD` from `/run/secrets/mail_password`, set
```sh
MAIL_PASSWORD_FILE=/run/secrets/mail_password
```
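A hedged sketch of how such a `*_FILE` override could be resolved at startup (the helper name and exact behaviour here are assumptions, not the actual capsulflask implementation):

```python
import os

def env_or_secret_file(name, default=None):
    """Illustrative: prefer NAME_FILE (a path to a secret file) over NAME."""
    file_path = os.environ.get(f"{name}_FILE")
    if file_path:
        with open(file_path) as f:
            return f.read().strip()
    return os.environ.get(name, default)

# hypothetical usage:
mail_password = env_or_secret_file("MAIL_PASSWORD")
```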

View File

@ -1,50 +0,0 @@
# capsul-flask's relationship to its Database Server
Capsul has a ["hub and spoke" architecture](./architecture.md). The "Hub" runs the web application and talks to the Postrges database, while the "Spoke"s are responsible for creating/managing virtual machines. One instance of the capsul-flask application can run in both hub mode and spoke mode at the same time, however there must only be one instance of the app running in "Hub" mode at any given time.
The Postgres connection parameters are [configurable](./configuration.md).
## <a name="schema_management"></a>Database schema management (schema versions)
capsul-flask has a concept of a schema version. When the application starts, it will query the database for a table named `schemaversion` that has one row and one column (`version`). If the `version` it finds is not equal to the `desiredSchemaVersion` variable set in `db.py`, it will run migration scripts from the `schema_migrations` folder one by one until the `schemaversion` table shows the correct version.
For example, the script named `02_up_xyz.sql` should contain code that migrates the database from schema version 1 to schema version 2. Likewise, the script `02_down_xyz.sql` should contain code that migrates from schema version 2 back to schema version 1.
**IMPORTANT: if you need to make changes to the schema, make a NEW schema version. DO NOT EDIT the existing schema versions.**
In general, for safety, schema version upgrades should not delete data. Schema version downgrades will simply throw an error and exit for now.
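A simplified sketch of the upgrade loop described above (the real logic lives in `db.py`; the function name and file-matching details here are illustrative assumptions):

```python
import glob

def migrate_up(cursor, desired_schema_version):
    """Illustrative: apply NN_up_*.sql scripts until schemaversion matches the target."""
    cursor.execute("SELECT version FROM schemaversion")
    current = cursor.fetchone()[0]
    while current < desired_schema_version:
        next_version = current + 1
        # e.g. schema_migrations/02_up_xyz.sql migrates from schema version 1 to 2
        script = glob.glob(f"schema_migrations/{next_version:02d}_up_*.sql")[0]
        with open(script) as f:
            cursor.execute(f.read())
        cursor.execute("UPDATE schemaversion SET version = %s", (next_version,))
        current = next_version
```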
## <a name="manual_queries"></a>Running manual database queries
You can manually mess around with the database like this:
```
pipenv run flask cli sql -f test.sql
```
```
pipenv run flask cli sql -c 'SELECT * FROM vms'
```
This one selects the vms table with the column name header:
```
pipenv run flask cli sql -c "SELECT string_agg(column_name::text, ', ') from information_schema.columns WHERE table_name='vms'; SELECT * from vms"
```
How to modify a payment manually, like if you get a chargeback or to fix customer payment issues:
```
$ pipenv run flask cli sql -c "SELECT id, created, email, dollars, invalidated from payments"
1, 2020-05-05T00:00:00, forest.n.johnson@gmail.com, 20.00, FALSE
$ pipenv run flask cli sql -c "UPDATE payments SET invalidated = True WHERE id = 1"
1 rows affected.
$ pipenv run flask cli sql -c "SELECT id, created, email, dollars, invalidated from payments"
1, 2020-05-05T00:00:00, forest.n.johnson@gmail.com, 20.00, TRUE
```
## how to view the logs on the database server (legion.cyberia.club)
`sudo -u postgres pg_dump capsul-flask | gzip -9 > capsul-backup-2021-02-15.gz`

View File

@ -1,87 +0,0 @@
# Deploying Capsul on a server
Capsul has a ["hub and spoke" architecture](./architecture.md). The "Hub" runs the web application and talks to the Postrges database, while the "Spoke"s are responsible for creating/managing virtual machines. One instance of the capsul-flask application can run in both hub mode and spoke mode at the same time, however there must only be one instance of the app running in "Hub" mode at any given time.
## <a name="spoke_mode_prerequisites"></a>Installing prerequisites for Spoke Mode
On your spoke (see [Architecture](./architecture.md)), you'll need `libvirtd`, `dnsmasq`, and `qemu-kvm`, plus a `/tank` directory with some operating system images in it:
```
sudo apt install libvirt-daemon-system virtinst git dnsmasq qemu qemu-kvm
sudo mkdir -p /var/www /tank/{vm,img,config}
sudo mkdir -p /tank/img/debian/10
cd !$
sudo wget https://cloud.debian.org/images/cloud/buster/20201023-432/debian-10-genericcloud-amd64-20201023-432.qcow2 -O root.img.qcow2
```
TODO: network set-up
TODO: cyberia-cloudinit.yml
## Deploying capsul-flask
### <a name="deploy_manually"></a>Manually
Follow the [local set-up instructions](./local-set-up.md) on your server.
Make sure to set `BASE_URL` correctly, generate your own secret tokens, and
configure your own daemon management for the capsul-flask server (e.g. writing
init scripts, or SystemD unit files).
Use the suggested `gunicorn` command (with appropriately-set address and port),
instead of `flask run`, to launch the server.
For example, here is the SystemD service unit file we use in production for `capsul.org`:
```
[Unit]
Description=capsul-flask virtual machines as a service
After=network.target
[Service]
ExecStart=/usr/local/bin/pipenv run gunicorn --bind 127.0.0.1:5000 -k gevent --worker-connections 1000 app:app
Restart=on-failure
WorkingDirectory=/opt/capsul-flask
[Install]
WantedBy=multi-user.target
```
TODO: cron runner is required to run maintenance tasks for now, but in the future we want to build this into the python based task scheduler.
### <a name="coop_cloud_docker"></a> Using Co-op Cloud's vanilla Docker Swarm configuration
Download the Co-op Cloud swarm `compose.yml`:
```sh
wget https://git.autonomic.zone/coop-cloud/capsul/src/branch/main/compose.yml
```
Optionally, download add-on compose files for Stripe, BTCPay, and Spoke Mode:
```sh
wget https://git.autonomic.zone/coop-cloud/capsul/src/branch/main/compose.{stripe,btcpay,spoke}.yml
```
Then, create a `.env` file and configure appropriately -- you probably want to
define most settings in [the Co-op Cloud `.envrc.sample`
file](https://git.autonomic.zone/coop-cloud/capsul/src/branch/main/.envrc.sample).
Load the environment variables (using `direnv`, or a manual `set -a && source .env && set +a`), insert any necessary secrets, then run the deployment:
```sh
docker stack deploy -c compose.yml -c compose.stripe.yml your_capsul
```
(where you'd add an extra `-c compose.btcpay.yml` for each optional compose file
you want, and set `your_capsul` to the "stack name" you want).
TODO: cron runner
### <a name="coop_cloud_abra"></a> Using Co-op Cloud's `abra` deployment tool
Follow [the guide in the README for the Co-op Cloud capsul package](https://git.autonomic.zone/coop-cloud/capsul/).
### Using docker-compose
TODO

View File

@ -1,68 +0,0 @@
# How to run Capsul locally
## <a name="manually"></a>Manually
Ensure you have the pre-requisites for the psycopg2 Postgres database adapter package:
```sh
sudo apt install python3-dev libpq-dev
pg_config --version
```
Ensure you have the wonderful `pipenv` python package management and virtual environment cli:
```sh
sudo apt install pipenv
```
Create python virtual environment and install packages:
```sh
pipenv install
```
Run an instance of Postgres (I used Docker for this; you can use whatever you want, the point is that it's listening on `localhost:5432`):
```sh
docker run --rm -it -e POSTGRES_PASSWORD=dev -p 5432:5432 postgres
```
Run the app
```sh
pipenv run flask run
```
or, using Gunicorn:
```sh
pipenv run gunicorn --bind 127.0.0.1:5000 -k gevent --worker-connections 1000 app:app
```
Note that by default when running locally, the `SPOKE_MODEL` is set to `mock`, meaning that it won't actually try to spawn vms.
## Crediting your account
Once you log in for the first time, you will want to give yourself some free capsulbux so you can create fake capsuls for testing.
```sh
pipenv run flask cli sql -c "INSERT INTO payments (email, dollars) VALUES ('<your email address here>', 20.00)"
```
## Running scheduled tasks:
```sh
pipenv run flask cli cron-task
```
## <a name="docker_compose"></a>Run locally with docker-compose
If you have Docker and Docker-Compose installed, you can use the `3wordchant/capsul-flask` Docker image to launch capsul-flask and a Postgres database server for you:
```sh
docker-compose up
```
`capsul-flask` will read settings from your `.env` file as usual; you can set any of the options mentioned in the [configuration documentation](./configuration.md).

View File

@ -1,34 +0,0 @@
## automated testing
Automated tests could make it safer to contribute code, easier to review new code, and much easier to refactor or upgrade Python dependencies.
To run tests:
1. create a Postgres database called `capsulflask_test`
- e.g.: `docker exec -it 98e1ddfbbffb createdb -U postgres -O postgres capsulflask_test`
- (`98e1ddfbbffb` is the docker container ID of the postgres container)
2. run `python3 -m unittest; cat unittest-log-output.log; rm unittest-log-output.log`
**NOTE** that right now we can't figure out how to get the tests to properly output the log messages that happened when they failed (or passed), so for now we have hacked it to write to a file.
### Architecture
I tried to make the absolute minimal changes needed to be able to override settings in tests. Possible alternative approaches include accepting an argument to create_app() to define which env file to load, or adding conditional logic to create_app() to pre-load specific settings before running load_dotenv(), but allowing env vars to override dotenv vars seemed cleanest. (Thanks @forest for improving on this approach)
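As a rough sketch of that override pattern (assumed for illustration, not copied from the code): values from `.env` act as defaults, and anything already set in the real environment, for example by a test, wins:

```python
import os
from dotenv import dotenv_values

# .env provides defaults; real environment variables (e.g. set by a test) take precedence
config = {
    **dotenv_values(".env"),
    **os.environ,
}
```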
### Creating test databases
One outstanding question is how to initialise/reinitialise the test database.
Currently, the tests rely on the existence of a capsulflask_test database on localhost, accessible by the postgres user with password dev.
I create this manually using:
`docker exec -it 98e1ddfbbffb createdb -U postgres -O postgres capsulflask_test`
where `98e1ddfbbffb` is the docker container ID of the postgres container.
In between test runs, you can either drop and recreate that database, or manually clear data using:
`docker exec -it 98e1ddfbbffb psql -U postgres capsulflask_test -c "DELETE FROM vm_ssh_authorized_key; DELETE FROM vm_ssh_host_key; DELETE FROM vms; DELETE FROM login_tokens; DELETE FROM ssh_public_keys; DELETE FROM unresolved_btcpay_invoices; DELETE FROM payments; DELETE FROM payment_sessions; DELETE FROM host_operation; DELETE FROM operations; DELETE FROM api_tokens; DELETE FROM accounts;" `
### Test coverage
This tests the "landing" (public) pages, login, and capsul index and creation. I didn't add automated coverage reporting yet; it's unclear whether that would be useful.

View File

Binary image file (35 KiB before and after).

View File

Binary image file (190 KiB before and after).

View File

Binary image file (41 KiB before and after).

View File

Binary image file (49 KiB before and after).

View File

Binary image file (22 KiB before and after).