
Flexible Premium Plans
Get full access to all the platform's incredible open source tools with each plan.
Only pay for the hours and storage you need, scale when you need, and cancel any time.
EARLY ACCESS PRICING!
We're offering heavily discounted pricing for our first community of users!
Subscribe today and secure this pricing for your entire first year of use.
Academia
FREE¹
Access to Open Source Tools
- Data Science
- Machine Learning
- Business Intelligence
Learning Acceleration Tools
- Sample Notebooks
- Hands-on Webinars
- Community Platform
Free 40 hours² to get you started.
Free 5 GB of cloud storage to take your code and data with you everywhere.
Community Support
Individual
CA$14.99
PER MONTH
Access to Open Source Tools
- Data Science
- Machine Learning
- Business Intelligence
Learning Acceleration Tools
- Expert Notebooks
- Hands-on Webinars
- Community Platform
160 hours² to develop your code.
100 GB of cloud storage to take your code and data with you everywhere.
Top up hours & storage on demand, more when you need more!
Help Desk Support
Teams
CA$29.99
PER MONTH
Access to Open Source Tools
- Data Science
- Machine Learning
- Business Intelligence
Learning Acceleration Tools
- 3 hours of one-on-one sessions
- Expert Notebooks
- Hands-on Webinars
- Community Platform
200 hours² to give everyone plenty of time to execute code.
200 GB of cloud storage to share with your team and keep all your data in one place.
Help Desk Support
Enterprise
CONTACT
Custom Open Source Tools
- Data Science
- Machine Learning
- Business Intelligence
Learning Acceleration Tools
- Expert Notebooks
- Hands-on Webinars
- Internal Training
Deployment Options
- Microsoft Azure
- Amazon AWS
- Google Cloud
- Others
Dedicated Customer Success resource
1. Requires a credit card to subscribe. Usage beyond the free hours and storage will be chargeable.
2. Plan hours are calculated based on application run time. For more details see the FAQ.
Yes! We have incorporated layers of security to ensure your data is private and secure. Here are a few examples of how we do that:
Your connection is protected with 128-bit encryption, so all data exchanged between your browser and Digital Hub™ is secure in transit. Your account is protected by state-of-the-art authentication and authorization solutions. Your payments are secured through leading payment service providers.
The default languages are Python, R, and Julia. Our development team is constantly surveying the open-source landscape and will integrate additional languages in the future. If you have a specific language in mind that should be on Digital Hub™, please share it with us on our community at https://community.digitalhub.io
The following open-source packages are available as part of the standard offering. These packages are up to date, maintained by their respective open-source communities, and integrated and tested by our own data science team:
Data Processing:
- pandas
- NumPy
- Beautiful Soup
- xlrd
Statistical Analysis:
- SciPy
Machine Learning:
- scikit-learn
Visualization:
- Matplotlib
- Seaborn
Postgres Access:
- psycopg2
- pgspecial
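For example, a typical analysis might combine several of these packages. The short Python sketch below (the data is randomly generated purely for illustration) loads data into a pandas DataFrame, runs a quick statistical check with SciPy, and plots the distribution with Seaborn and Matplotlib:

import numpy as np
import pandas as pd
from scipy import stats
import matplotlib.pyplot as plt
import seaborn as sns

# Illustrative data: 200 random measurements. In practice you would load your
# own data, e.g. df = pd.read_csv("measurements.csv").
df = pd.DataFrame({"value": np.random.normal(loc=50, scale=5, size=200)})

# Summary statistics with pandas and a normality test with SciPy.
print(df["value"].describe())
print(stats.shapiro(df["value"]))

# Quick visualization with Seaborn on top of Matplotlib.
sns.histplot(df["value"], kde=True)
plt.title("Distribution of values")
plt.savefig("distribution.png")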
A cloud-native Postgres database is created by default for each project. This default database is automatically linked to other project applications, such as Superset and Grafana, as the default data source. The default database URI is also made available in your JupyterLab as an environment variable.
External databases may be configured manually. Currently, only the Postgres database client is natively installed; however, you may configure access to other databases by installing the corresponding client packages (see this blog for instructions). In JupyterLab, you can access your database with a proper SQLAlchemy URI using the sqlalchemy library. In Superset and Grafana, you can integrate your database by adding it as a data source in the application interface.
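As a quick illustration, the sketch below shows how you might query the default database from a JupyterLab notebook using the sqlalchemy and pandas packages. The environment variable name used here (DATABASE_URI) is only a placeholder; check your project for the actual variable that Digital Hub™ exposes.

import os
import pandas as pd
from sqlalchemy import create_engine

# The default database URI is injected into the JupyterLab environment;
# DATABASE_URI is a placeholder name for this example.
db_uri = os.environ["DATABASE_URI"]
engine = create_engine(db_uri)

# Run a simple query and load the result into a pandas DataFrame.
with engine.connect() as conn:
    df = pd.read_sql("SELECT version();", conn)
print(df)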
When you delete your account, your ID will be removed from the system. This means you lose access to the platform immediately, and if you are subscribed to any plan, the subscription will be cancelled. Your projects and data will automatically be wiped from the system. Please back up your data before you delete your account.
You can register for an Academia plan using promotion codes available on our site. Please note that promotion codes expire, so register while they are valid.
For step-by-step instructions, please follow this guide:
https://digitalhubsupport.atlassian.net/wiki/spaces/DKB/pages/88997889/How+to+use+promotion+codes
All support questions should be directed to our engineering team via email. You may email us at info@digitalhub.io. For general how-to questions, please reach us on the community page at https://community.digitalhub.io
The creator of the project owns the data and IP; for full details, please see our terms and conditions page.
Submit a request to our development team through info@digitalhub.io
Your subscription includes access to the Digital Hub™ platform and all its features, including team management, project management, and data science applications. Applications use compute and require storage, so we have simplified app usage into two measures: time and space. Your subscription includes a certain number of hours to develop with the platform's tools, plus space to store your data.
You can switch between plans. However, when you downgrade, you lose your allocated storage and hours, as well as your data. We highly recommend you back up your data before downgrading your plan, or contact us at info@digitalhub.io to discuss your specific situation before downgrading to ensure minimal impact.
Our community is a free service to help you transition into the data science space. We share useful articles and material to help you learn and grow with us. The founders of Digital Hub™ have many years of industry and data science experience, and the community is a place for all of us to share ideas and help each other.
Visit https://community.digitalhub.io for more info.
Digital Hub™ was created with love by the Calgary-based team at Integra Data and Analytic Solutions Corp. We are a group of engineers and data scientists who felt there was a gap in enabling enterprise-grade infrastructure for day-to-day data science needs. We were looking for a flexible and automated online way to easily access and integrate various data science and development tools. We wanted a tool that could fill the gaps in the open-source ecosystem, such as security, authentication, version control, scalability, and collaboration.
We created Digital Hub™ for our own internal use and are now making it available for everyone else! You can see our stories here and reach out to us on this site at any time.
Digital Hub™ scales based on the dynamic usage of memory, compute, and storage by each application. The cost is simplified to hours and storage usage, and each plan offers a convenient package to get you started.
Additional time and storage can be purchased on demand. You can calculate your cost using this formula: Plan Price + Top Up Hours + Top Up Storage. If you have a specific scenario in mind, contact us for an estimate at info@digitalhub.io or ask us on the community at https://community.digitalhub.io
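As an illustration, the sketch below applies that formula in Python; the top-up rates used here are placeholder values for the example only, not actual Digital Hub™ prices.

# Plan Price + Top Up Hours + Top Up Storage, with placeholder top-up rates.
plan_price = 29.99            # Teams plan, CA$ per month
top_up_hour_rate = 0.10       # hypothetical CA$ per additional hour
top_up_storage_rate = 0.05    # hypothetical CA$ per additional GB

extra_hours = 50              # hours used beyond the plan's included 200
extra_storage_gb = 20         # storage used beyond the plan's included 200 GB

monthly_cost = (plan_price
                + extra_hours * top_up_hour_rate
                + extra_storage_gb * top_up_storage_rate)
print(f"Estimated monthly cost: CA${monthly_cost:.2f}")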
Please contact us at info@digitalhub.io with your questions, or check our Slack channel at https://digitalhubsupport.slack.com/ or community page at https://community.digitalhub.io for general inquiries.
Hours are calculated based on application run time. For example, when you create a project with 3 applications and these applications are in "Play" mode, hours add up per application. You can "Pause" any application that you don't need. You may also "Pause All" applications from the profile menu before you log off.
In a team project, usage from all members is counted towards the team's subscription owner. As with all projects, hours are calculated based on the total run time of all application instances.
In Digital Hub™, there are two types of applications, project-based and member-based, and the runtime is calculated differently for each type.
1. Project-based applications (Superset, Grafana): A single instance is deployed per project, and all members in the project log in and work on the same application instance. For example, if your Superset instance is ‘ready’ and running for three hours while two members work in it synchronously, the total run time is three hours.
2. Member-based applications (JupyterLab, H2O): An instance is deployed per member, and each member starts their own instance to work on. For example, if both Member A and Member B spend 1 hour working in JupyterLab, the total application runtime is 2 hours, because each member is running their own instance.
Here is an example where a team of three works on a project for three hours. During these three hours, all members see the ‘ready’ status for all of their applications: Superset, Grafana, and JupyterLab. In this scenario, the hours are calculated as follows:
(# Hours) * (# Project-based Apps) + (# Hours) * (# Members on JupyterLab)
= (3 hrs * 2 apps) + (3 hrs * 3 members)
= 15 hrs
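The same calculation can be expressed as a small Python helper (the application names follow the examples above):

def total_plan_hours(hours, project_based_apps, members_on_jupyterlab):
    # Project-based apps (e.g. Superset, Grafana) count once per project;
    # member-based apps (e.g. JupyterLab) count once per active member.
    return hours * project_based_apps + hours * members_on_jupyterlab

# The worked example above: 3 hours, 2 project-based apps, 3 members in JupyterLab.
print(total_plan_hours(hours=3, project_based_apps=2, members_on_jupyterlab=3))  # 15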
As the owner of the Teams plan, you have the ability to assign roles to the people you invite to your projects. A single user can be added to multiple projects and have a different role in each project.
Creator: You are the Creator of the project if you are subscribed to the Teams plan. This is your default role every time a project is created. As Creator, you manage the plan, payments, projects, and the team.
Owner: An Owner is a user who has Read and Write access to data and code. They can manipulate the data and database, create dashboards, and interact with them. However, this permission level doesn't allow them to create projects, add or remove users, or manage payments. As an Owner, you can manage projects and the team.
Reader: These users only have Read access to projects and applications. They can't alter the code or manipulate the data, but they are able to view and interact with the code and dashboards. As a Reader, you can view projects and outputs from various software components.
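To summarize, the role model described above can be pictured as a simple permission lookup. The sketch below is only an illustrative restatement of the rules listed here, not the platform's actual access-control configuration:

# Illustrative summary of the Teams roles described above.
ROLE_PERMISSIONS = {
    # Creator: manages the plan, payments, projects, and the team.
    "Creator": {"manage_plan_and_payments", "create_projects", "add_remove_users",
                "manage_projects", "manage_team", "read", "write"},
    # Owner: read/write on data and code, manages projects and the team.
    "Owner": {"manage_projects", "manage_team", "read", "write"},
    # Reader: read-only access to projects, code, and dashboards.
    "Reader": {"read"},
}

def can(role, permission):
    # Return True if the given role includes the given permission.
    return permission in ROLE_PERMISSIONS.get(role, set())

print(can("Owner", "manage_plan_and_payments"))  # False
print(can("Reader", "read"))                     # True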