This document provides Information Technology professionals with comprehensive information on the technical architecture of the Wizdom Intranet (by LiveTiles) platform for deployment into your Microsoft Office 365 and Azure environments.
People use LiveTiles to connect to the expertise, information and services they need to get their jobs done more effectively—no matter where they are, what role they fulfil, or what device they are using.
LiveTiles provides a single vendor with comprehensive capabilities that can be deployed as required:
- Wizdom Intranet for SharePoint
- Wizdom for Teams
- Page Designer for SharePoint
- Hyperfish: Employee Profile Manager and Automated Org Chart
- Intelligence - Analytics
- Bots and Conversational Assistants
The purpose of this document is to list and describe the requirements for installing the Wizdom product on an Office 365 tenant and Microsoft Azure.
The product can be installed on an existing customer Azure subscription or hosted on Wizdom’s Azure portal.
Wizdom needs to be installed in a SharePoint Online site collection in Office 365 and integrates with Microsoft Azure and other Office 365 services. Before Wizdom can complete the installation, some prerequisites must be in place in the customer's Azure environment, and for on-premises environments there are likely also some technical issues that must be clarified and sorted out first.
Wizdom can offer services for Office 365 and Azure subscriptions, integrating and aligning the customer's cloud environments to match the Wizdom installation.
If the customer has an existing Microsoft Azure installation, or would like to create one themselves, Wizdom has some requirements that need to be in place before the installation can start.
Office 365 has the following requirements and recommendations:
- An Office 365 tenant must be activated, with all users either synchronized to the cloud using Azure AD Connect or created directly in the cloud as non-federated users (*.onmicrosoft.com).
- It is recommended that consultants from Wizdom have access to an account with Global Administrator rights on the customer's Office 365 tenant, so they can assist with the installation and configuration of the Wizdom setup. The account can be created as a non-federated user (*.onmicrosoft.com).
The following roles are, as a minimum, required to provision Wizdom in SharePoint Online, including adding the Azure AD app and accessing the Azure resource group (SaaS).
If the customer has objections and cannot comply with these requirements, then Wizdom must, as a minimum, have site collection owner rights to the global app catalog in Office 365. In this scenario, the customer will be responsible for installing apps in the Office 365 app catalog by following instructions from Wizdom.
SharePoint Online has the following requirements and recommendations:
- The selected SharePoint Online plan must, as a minimum, provide access for the number of users the customer expects, and allow the number of private site collections that corresponds to the number of Wizdom instances the customer expects.
- All users that need to access Wizdom must be assigned a SharePoint Online license.
- Wizdom would like to emphasize that the name chosen for the customer tenant will be used in the URLs of all private site collections in SharePoint Online. The name cannot be changed afterwards.
Let’s give an example:
The customer provisions a new Office 365 tenant and selects the name “Wizdom365”. The first site collection will be named https://wizdom365.sharepoint.com, and subsequent site collections will be placed under https://wizdom365.sharepoint.com/<managed-path>/<site-collection-url>/.
The managed path can be either “/sites/” or “/teams/”; be aware that it is not possible to use custom managed paths in Office 365.
Wizdom can also be installed on the root site collection, https://company.sharepoint.com, in SharePoint Online using the existing team site template. If the root site collection is chosen, custom scripting must be enabled at least 24 hours before setting up Wizdom. Custom scripting can be found and activated in the SharePoint Online Admin Center > Custom Script (personal and self-service sites).
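The URL structure above can be sketched as a small helper. This is an illustrative assumption, not part of the Wizdom product; the tenant and site names are examples only.

```typescript
// Hypothetical helper illustrating SharePoint Online site collection URLs.
const MANAGED_PATHS = ["sites", "teams"] as const;
type ManagedPath = (typeof MANAGED_PATHS)[number];

function siteCollectionUrl(tenant: string, managedPath?: ManagedPath, site?: string): string {
  const root = `https://${tenant.toLowerCase()}.sharepoint.com`;
  if (!managedPath || !site) {
    return root; // the first (root) site collection
  }
  // Office 365 only allows the built-in "/sites/" and "/teams/"
  // managed paths; custom managed paths are not possible.
  return `${root}/${managedPath}/${site}`;
}
```

For a tenant named “Wizdom365”, `siteCollectionUrl("Wizdom365")` yields the root URL and `siteCollectionUrl("Wizdom365", "sites", "hr")` a subsequent site collection URL.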
Azure has the following requirements and recommendations:
- It’s recommended that Wizdom is a co-administrator, to be able to provision Azure services and databases.
If the customer cannot comply with this recommendation, a dedicated Azure subscription can be created for the purpose as an alternative. More information on Azure subscriptions can be
found here: https://azure.microsoft.com/en-us/pricing/?b=16.43.
- It’s a requirement that Wizdom has read access to Azure Active Directory, since Wizdom needs to add an application through the Azure portal. The Client ID and Client Secret must be handed over to Wizdom after the creation. More information regarding access and rights to other applications can be found here: https://docs.microsoft.com/en-us/azure/active-directory/active-directory-integratingapplications#BKMK_Graph
The Azure services can be scaled in different “packages”, and services are affected immediately when the scale plan is changed. Wizdom recommends that the web app has the following scale plan as a minimum to provide faster performance:
App Service Plan tier: Standard
Increasing or decreasing a service plan has an immediate effect on cost, and the customer must be aware of this.
Wizdom has no special requirements for the client operating system beyond the requirements that Microsoft has listed for Office 365. More information can be found here:
The following table lists the information that the customer must prepare before the installation of Wizdom can take place.
Also, please note that Wizdom requires a dedicated account for search in SharePoint Online. This account can be a non-federated account (*.onmicrosoft.com) and does not require any licenses assigned.
Please be aware that Wizdom requires this user account not to have an expiring password. More information on how to disable password expiration can be found here: https://support.office.com/en-us/article/Set-an-individual-user-s-password-to-never-expire-f493e3af-e1d8-4668-9211-230c245a0466?ui=en-US&rs=en-US&ad=US&fromAR=1
The following table needs input from the customer before Wizdom A/S can start on the setup and installation of the Wizdom product.
When the table is completed, please send the information back to the contact person at Wizdom A/S.
| Required input | Description |
|---|---|
| Office 365 admin center URL | Name of the Office 365 admin center URL |
| Wizdom URL | The URL where Wizdom will be located |
| Customer name | The full name of the customer |
| Site collection admin | The Wizdom A/S site collection admin user for the Tenant Admin Center site collection |
| Team site user | The user required for creating modern team sites |
| Search user | The user required for search |
| Number of users | Number of users that will access Wizdom, e.g. 300 users |
| Contact person | The responsible point of contact at the customer (name, email and phone number) |
Wizdom highly recommends that the customer assigns Wizdom as Delegated Admin on the Office 365 tenant.
When Wizdom administers an organization’s account on behalf of the customer, Wizdom is providing delegated administration. As a delegated administrator, Wizdom can perform tasks such as adding users, resetting passwords, and adding domains.
Before Wizdom can start administering a customer account, the customer must authorize Wizdom as a delegated administrator. To get customer approval, Wizdom A/S first sends the customer an offer for delegated administration, which can be included with a trial invitation or purchase offer. Wizdom can also offer delegated administration to the customer later.
This will also be an extra level of support that the customer can take advantage of.
LiveTiles requires your organization to have an active Office 365 subscription or SharePoint Server infrastructure. Azure components can be hosted by LiveTiles or in your Azure tenant.
LiveTiles deployed into your Office 365 tenant requires your employees to be licensed with any of the following Microsoft plans:
- SharePoint Online Plan 1/2
- E1, E3, E5, F1
LiveTiles can be deployed to either Office 365 SharePoint Online or to SharePoint Server 2013, 2016, 2019 or in a hybrid capacity. The Office 365 version provides richer capability.
LiveTiles can be deployed to either Classic or Modern SharePoint sites. However, the Modern SharePoint experience is recommended as it provides superior content administration capabilities.
LiveTiles components can be deployed to every site collection in a SharePoint deployment or targeted at specific site collections as needed.
LiveTiles will function on all browsers supported by Microsoft for Office 365. https://products.office.com/en-us/office-system-requirements
- Internet Explorer 11
LiveTiles technologies are purpose built for your investment in Microsoft Office 365 and Azure. As such, our technologies respect and inherit in full the capabilities of Microsoft as it relates to information security, privacy, compliance, GDPR, data location, data loss prevention, eDiscovery, retention policies etc. Refer to the official Microsoft references below:
- Office 365 Security & Compliance Center
- Azure Trust Center
No, LiveTiles does not host your intranet content and data. SharePoint data, including pages, data in lists, and documents in libraries, is stored 100% in your SharePoint environment in accordance with your agreement with Microsoft. Only configuration data, branding and logic are rendered by the Wizdom Intranet back-end outside of SharePoint, and this back-end can also be deployed to your Azure tenant. The one exception is the “Noticeboard”, which uses Azure to store posts for the purpose of sharing disparate information across site collections.
There are two applications where LiveTiles may have access to data:
- Hyperfish (Employee Profile Manager) which scans AD for missing or incorrect user profile information using a locally installed agent. This is done for the purposes of improving employee profile data to deliver high quality people / skills directory and the automated organizational chart capabilities.
- Intelligence (Analytics), which stores de-identified user click data across pages in a site in a LiveTiles hosted Microsoft Azure database. The click data represents what content users are navigating to the most on a specific page or overall site and from what device type. This is done for the purposes of collecting usage analytics and reporting so your organization can make informed decisions on how to improve experiences.
Communication between your environment and LiveTiles occurs in the following standard scenarios:
- Secure HTTPS connection to the LiveTiles source hosted on Microsoft Azure for the purposes of updating applications
- Secure HTTPS connection to the LiveTiles Licensing Service (LTLS) hosted on Microsoft Azure for the purpose of establishing licensing status
- Anonymous user telemetry and error capturing via Microsoft Azure Application Insights
The Wizdom Intranet (a SharePoint Add-in) elevates Office 365 and SharePoint based solutions including intranets, extranets and line-of-business portals into enterprise ready digital workplaces. Taking best practices from hundreds of companies and packaging them in a world-class digital workplace solution, Wizdom offers you the most comprehensive selection of state-of-the-art intranet features and business applications without the time, cost and risks associated with custom development.
Wizdom safeguards you with frequent releases that make sure your digital workplace will always benefit from the newest trends and capabilities on the market. Core Wizdom Intranet capabilities delivered in optional MODULES include:
- Delivering a consumer grade user interface (UI) aligned to your identity with DESIGN and BRANDING
- Engaging employees with targeted, relevant CORPORATE NEWS
- Allowing end users to personalize their content experience with NOTICEBOARD
- Simplifying access to disparate content and tools from any Office 365 application with POWER PANEL
- Single click location of disparate content and tools from any page with MEGA MENU
- Taking internal and external collaboration in Microsoft Teams to new heights with WORKSPACES
- Helping users find their way with FAQs and QUICKLINKS
- Managing document life cycles and tracking compliance with POLICIES / PROCEDURES
- Keeping content fresh, relevant and useful with CONTENT GOVERNANCE and PAGE CONTACTS
- Supporting your global workforce with LANGUAGES and TRANSLATIONS
- Integrating with line-of-business applications, e.g. CRM, HRIS, ITSM, with WEBHOOKS
- Anytime, anywhere, any device access with a dedicated iOS and Android MOBILE APP
The Wizdom SharePoint Add-in is installed in the corporate app catalog. Wizdom can be installed on any number of site collections on the tenant. Site collections with Wizdom installed will have the Wizdom UI, script controls, web parts, page layouts and business applications available. Wizdom can also be easily uninstalled from a site collection, leaving it as a standard SharePoint site collection.
A number of Azure services are used to deliver the solution:
The Wizdom logical architecture has 3 main concepts:
- Web parts in SharePoint
- Wizdom application services
- Wizdom infrastructure services
As a user launches a Wizdom page, the active components on the page query the Wizdom back end for relevant content. The Wizdom application then determines whether it needs to serve up new content or can reuse cached content for performance enhancement. The data and content are then returned to the active Wizdom component on the SharePoint page, which then renders them according to the selected Wizdom template and branding.
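The serve-or-reuse decision described above can be sketched as a small TTL cache. The cache key, TTL and loader are illustrative assumptions, not the actual Wizdom implementation.

```typescript
// Minimal sketch of a cache-or-fetch decision for page content.
type Entry<T> = { value: T; expires: number };

class ContentCache<T> {
  private store = new Map<string, Entry<T>>();
  constructor(private ttlMs: number) {}

  get(key: string, loader: () => T, now: number = Date.now()): T {
    const hit = this.store.get(key);
    if (hit && hit.expires > now) {
      return hit.value; // reuse cached content
    }
    const value = loader(); // serve up new content
    this.store.set(key, { value, expires: now + this.ttlMs });
    return value;
  }
}
```

Within the TTL window, repeated component queries for the same key are answered from cache without hitting the loader again.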
Wizdom uses the SharePoint Add-in model (previously named App-model). When using the Add-in model, the Wizdom application has a minimal footprint with application code executed outside of the SharePoint environment, integrating with SharePoint through iFrames, scripts and web service endpoints.
When Wizdom is installed on a site collection, the SharePoint Add-in automatically verifies its validity through the SharePoint appredirect.aspx page, which returns an access token that Wizdom can use to query SharePoint and its services. Wizdom has client-side scripts that communicate with Azure-hosted REST endpoints.
The Azure endpoints communicate with SharePoint, SQL, Redis Cache etc. The endpoints' incoming requests are verified from the session and SharePoint cache key and processed if valid. Wizdom uses different Azure services (Cache, Database etc.). Credentials for these services are stored in the web.config for the Azure website. Access tokens are retrieved using these credentials.
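A client-side call to one of these Azure endpoints might be shaped as below. The endpoint path and the `X-Cache-Key` header name are illustrative assumptions; only the Bearer token pattern is standard.

```typescript
// Sketch of a client-side request to an Azure-hosted Wizdom REST endpoint.
interface RequestSpec {
  url: string;
  method: "GET" | "POST";
  headers: Record<string, string>;
}

function buildEndpointRequest(endpoint: string, accessToken: string, cacheKey: string): RequestSpec {
  return {
    url: endpoint,
    method: "GET",
    headers: {
      // access token obtained through the add-in flow
      Authorization: `Bearer ${accessToken}`,
      // hypothetical header: the back end validates the session and
      // SharePoint cache key before processing the request
      "X-Cache-Key": cacheKey,
    },
  };
}
```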
Wizdom requires the following services to be activated:
- Azure Active Directory (AAD)–AAD is the primary directory for all Microsoft online services including Office 365. Wizdom needs programmatic access to AAD through REST API endpoints provided by registering the Wizdom application in the Azure Portal and granting it API access to perform read operations on directory data and objects.
- Website - Wizdom uses a single website to host all application REST endpoints. The website serves HTTPS requests from the scripts and web parts hosted in a SharePoint page. The website can be scaled up (more CPU) and out (more instances) to support numerous simultaneous users. Scaling websites is easily done in the Scale tab of the Azure portal.
- SQL Database - Wizdom uses a SQL database to store relational data for applications.
- Redis Cache - Redis Cache is a secure, dedicated cache service. Wizdom uses Redis Cache to store intensive and slow queries to optimize response times in applications.
- Storage - Wizdom uses BLOB storage to store configuration that spans site collections. Wizdom also uses Table storage to store diagnostic logging.
Users don’t authenticate into Wizdom per se, but rather into your SharePoint solution built with Wizdom, through the authentication mechanism you have in place for access to Office 365 or your on-premises server. Any additional authentication arrangements that you have in place, e.g. Single Sign-On (SSO) for users accessing SharePoint from other applications or Multifactor Authentication (MFA), will be respected in full by Wizdom.
Wizdom has no implications for “end-users” or general content administrators, as they are simply interacting with SharePoint solutions powered by Wizdom. In summary, when planning for user permissions, you should start by understanding permission levels assigned to core groups in SharePoint–Site Collection Administrators, Owners, Members, Visitors. Solutions built with LiveTiles will 100% inherit and respect whatever permissions model you choose to apply. https://docs.microsoft.com/en-us/sharepoint/understanding-permission-levels
Additional to standard SharePoint user permissions are specific application level permissions that provide the ability to access the Wizdom back end to perform specific configurations. Wizdom supports multiple levels of administrators: Global Administrators can configure any module and Module Administrators can configure a specific module(s).
Product updates are delivered automatically to your environment via a secure HTTPS connection to our source in Microsoft Azure. Note that updates are not automatically applied and administrators decide on whether to apply the updates.
Updates are delivered when new capabilities are available or when bug fixes are required. Updates are always communicated in advance to our clients through release notes documentation built directly into the Wizdom application.
Extension not deprecation
Applied updates focus on extension and not deprecation, so updates will typically not have any impact on existing pages and applications running inside your Wizdom deployment.
Downtime and upgrade scheduling
Upgrading Wizdom does not cause an outage of access to your SharePoint solution, however for 1-2 minutes, some Wizdom features may not be available. As a result, most organizations schedule updates to be applied on a weekend outside of business hours.
Wizdom is deeply extensible in alignment with the technologies and best practices recommended by Microsoft for extending Office 365 and SharePoint solutions. There are two core elements:
Custom client-side components such as web parts can be injected into Wizdom using the SharePoint Framework (SPFx) and its standard tool chain. SPFx can be used to develop components that consume data from other Office 365 applications using, for example, the Graph API, SharePoint API or Power BI API, or that connect to the REST API services of external vendors. Web parts capable of GET, PUT, POST and DELETE operations against common line-of-business systems, including CRM, HRIS and ITSM, can be created by developers with web development skills.
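A component like this typically composes Graph REST URLs before issuing requests. A minimal sketch of such a helper, under the assumption that the web part targets the public `graph.microsoft.com/v1.0` endpoint:

```typescript
// Hypothetical helper a custom SPFx web part might use to compose
// Microsoft Graph REST calls; the paths used are only examples.
function graphUrl(path: string, params: Record<string, string> = {}): string {
  const query = new URLSearchParams(params).toString();
  const base = `https://graph.microsoft.com/v1.0/${path.replace(/^\/+/, "")}`;
  return query ? `${base}?${query}` : base;
}
```

The resulting URL would then be fetched with an access token supplied by the SPFx runtime.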
Wizdom REST API
The Wizdom API is a RESTful API with a JSON payload that can be used to access Wizdom services from external applications. The Wizdom “Webhooks” module allows you to subscribe to certain events in Wizdom. When one of those events is triggered, Wizdom sends an HTTP POST payload to the webhook's configured URL.
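The delivery model can be sketched as follows. The event names, payload fields and subscription shape are illustrative assumptions, not the documented Wizdom webhook contract.

```typescript
// Sketch: when a subscribed event fires, an HTTP POST with a JSON
// payload is prepared for each matching configured URL.
interface WebhookSubscription { event: string; url: string; }
interface OutboundPost { url: string; method: "POST"; body: string; }

function deliveries(subscriptions: WebhookSubscription[], event: string, payload: object): OutboundPost[] {
  return subscriptions
    .filter((s) => s.event === event)
    .map((s): OutboundPost => ({
      url: s.url,
      method: "POST",
      body: JSON.stringify({ event, payload }), // JSON payload sent to the hook
    }));
}
```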
The Page Designer for Modern SharePoint application allows organizations to overcome the constraints on layout, design and branding imposed by default Modern SharePoint. With Page Designer, companies that want to inject their identity into SharePoint can do so with custom tiles, branding / CSS, page layouts and templates.
Page Designer enables customization of the SharePoint UI in alignment with Microsoft recommended best practices. As a drag and drop application, it empowers web designers and power users to implement designs without the need for specialist skills or the time, costs and risks of customizing the SharePoint UI with code.
Page Designer is deployed into a Modern SharePoint site collection via:
- Deploying an SPPKG package to the App Catalog
- Activating the Single Part App Page as a web part to target site collections
Page Designer follows the Microsoft mechanism for app approvals and can only be added to a site collection following a formal approval process whereby the Site Collection Administrator requests access from the App Catalog / Portal Administrator who approves Page Designer for use in the target site collection.
Access to Page Designer functionality will typically be limited to a small group of users in the following SharePoint permissions groups. These are automatically created upon deployment. Note: the groups with design capabilities should not be confused with content administration permissions controlled through standard Owners, Members, Visitors permissions groups.
- LiveTiles Owners – full control over the application
- LiveTiles Designers – ability to design but not modify elements controlled by LiveTiles Owners
Hyperfish enables organizations to automatically identify and populate missing information in directories, quickly and easily. Utilizing next generation AI and Bot technologies, Hyperfish automates the process of keeping Active Directory and Office Profile information fresh and relevant.
Using Hyperfish, organizations can be more effective by saving time, reducing IT Support overhead, and improving the speed of business communications as well as enhancing already existing Microsoft investments such as Office 365, Exchange, SharePoint, and Delve etc. A summary of benefits includes:
- Improves IT service delivery - reduces the manual work associated with keeping user profiles up to date and improves IT responsiveness
- Increases employee engagement - enables people to quickly and easily connect with colleagues and their specific skills
- Reduces the risk of sending information to the wrong person and filters out inappropriate profile information
- Targets content - eg SharePoint news/announcements based on correct profile data
- Effectively automates workflows dependent on correct definitions, e.g. "Department" or "Manager"
As an organization, you decide on the user profile data that is important for you to collect. For example, this may be specific expertise relevant to industry, clients, people, work or projects aligned with that profile. Data collected can relate to standard default or “extended” attributes inside your AD profile schema. You have complete control over:
- Data validation - what data is collected, what is "read only" vs "editable" and what may be entered
- Approvals - who must review and approve profile updates?
- Collections - setting different profile requirements for specific groups of users, e.g. by country
Hyperfish uses new technologies such as machine learning, advanced analytics and bot technology to dramatically improve directory content in two phases:
- Continually monitors directories for inconsistent, invalid, aged and missing information
- Contacts users to request and validate information via personalized email workflow requests based on the information required and user preferences
In online deployments, Hyperfish connects directly to Azure Active Directory to scan for the quality of user profile information. For any implementation scenario utilizing an on-premises Active Directory system (on-premises or hybrid), Hyperfish scans for the absence of user profile information using a locally installed agent (referred to as the Hyperfish Agent).
After directory analysis is performed, a full report can be viewed from the Hyperfish web application, where administrative tasks and product configuration can be accessed as well.
Hyperfish can be used in three modes:
- Analyze - directory analysis is performed and a report is generated. Hyperfish does not contact users or write any changes to Active Directory in this mode.
- Pilot - A group of participants can be selected to participate in a small-scale implementation of Hyperfish. The participants receive profile update messages and have the option to update profile information through direct response or by using the Hyperfish profile update page.
- Run - all users in the domain receive profile update messages and have the option to update profile information through direct response, or by using the Hyperfish profile update page.
In both pilot and run modes, specified profile attributes can be selected to pass specified administrator approval before changes are committed.
Fully implemented, Hyperfish comprises two components: Hyperfish - the cloud service and the Hyperfish Agent. Together, these components can analyze and update Active Directory contents regardless of how an organization’s Active Directory topology is configured.
A hybrid Hyperfish deployment with online Azure AD and on-premises AD
An online Hyperfish deployment
As a hosted service, Hyperfish analyzes directories in online (Microsoft Azure Active Directory), on-premises (Microsoft Active Directory), as well as hybrid environments. Hyperfish is built on the Microsoft Azure platform.
Online - Hyperfish connects directly to Azure Active Directory and performs an analysis. A report is generated and users are contacted to update profile information. Collected user information is written back to the Azure Active Directory.
On-premises - Hyperfish generates a report based on results gathered by the on-premises Hyperfish Agent. Once user profile information is gathered, changes are relayed to the Hyperfish Agent and written to the local Active Directory instance.
Hybrid - Hyperfish connects to the on-premises Hyperfish Agent and performs an analysis of Active Directory. A report is generated and users are contacted to update their profile attributes. Collected user information is relayed to the on-premises Hyperfish Agent and written to the local Active Directory instance. The update cycle is complete when Azure Active Directory is synchronized with the on-premises Active Directory instance through Azure AD Connect or Office 365 Directory Synchronization (DirSync).
Hyperfish scans on-premises Active Directory information through the Hyperfish Agent, a locally installed service.
For best results, the Hyperfish Agent should be installed on a domain-joined server that meets or exceeds the minimum system requirements:
Supported Operating Systems: Windows Server 2012 R2 or above | Microsoft .NET Framework 4.5.2 (packaged with installation executable) | Processor: 2 GHz | Memory: 4 GB
Although the agent can run from any domain-joined machine, we recommend it is installed on a secure and consistently available host within your organization’s networked domain.
To securely pair the host machine identity with the Hyperfish cloud service, a ten-character code is generated by the cloud service, provided through the Hyperfish web application interface during the setup process. This code is required during agent installation. When the code is entered during the installation instance, the agent makes an API call to the Hyperfish service and the machine is registered in the Hyperfish database. An authentication token (JSON Web Token) is generated for the agent host and placed in a secure store for future interactions with the Hyperfish service. If connection between the Agent and the Service is severed, subsequent analyses will cease, but no data will be lost.
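The code-generation step of this pairing handshake can be sketched as below. The character set (restricted to unambiguous characters) is an assumption; only the ten-character length comes from the description above.

```typescript
// Sketch: generate a ten-character pairing code of the kind the
// Hyperfish cloud service issues during agent setup.
import { randomInt } from "node:crypto";

const CODE_CHARSET = "ABCDEFGHJKLMNPQRSTUVWXYZ23456789"; // assumed charset
const CODE_LENGTH = 10;

function generatePairingCode(): string {
  let code = "";
  for (let i = 0; i < CODE_LENGTH; i++) {
    // cryptographically secure random index into the charset
    code += CODE_CHARSET[randomInt(CODE_CHARSET.length)];
  }
  return code;
}
```

The agent would submit this code in its registration API call and receive back the JSON Web Token it stores for future interactions.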
The Hyperfish service is operated by a service account with read and write permissions to user accounts in Active Directory. Service account permissions should be provisioned according to the principle of least privilege. Providing permissions to the target AD container using the Delegation of Control Wizard is the easiest method of provisioning rights to the Hyperfish service account.
Employee Profile Manager Page / Web Part
After Hyperfish identifies user accounts that are missing profile attribute information, a conversation is started with the end-users to collect the missing information. Although the users may choose to respond directly through the channel of communication - eg email - a link to a self-service profile update page is also provided.
Employee / Expertise Directory
Automated Organization Chart Web Part
Hyperfish Cloud Service
The results of the analysis component (completion statistics and calculated percentages for AD properties) are stored by Hyperfish for 30 days, in addition to the data from the most recent analysis. These results include:
- Which AD user property was analyzed
- Date and time when the property was analyzed
If the Profile Validation feature is used, Hyperfish will store all user attribute entries in the cloud service until the Profile Validation feature is disabled. This data can be removed from Hyperfish systems on request.
If a user chooses to update profile information using the Hyperfish profile update page, Hyperfish stores the following information for administrator approval until the change is approved or rejected:
- The name of the user making the update
- The property that was updated
- The new and old value of the updated property
For each user scanned, the Hyperfish cloud service indefinitely stores the following properties:
- User identifier (Office 365 only)
- Object GUID (On-premises only)
- User distinguished name (On-premises only)
- User email address
- User principal name (Office 365 only)
These properties are stored to contact individual users (using the username and email address) and produce user Profile Update Pages (using the user identifier or object GUID).
Only AD objects with a valid mail property are scanned. This omits most service accounts and allows for more accurate analysis results. In environments with an on-premises AD instance, individual Organizational Units (OUs) can be targeted from Hyperfish settings to scope analysis to preferred OUs.
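The scoping rule above can be sketched as a simple filter. The object shape and the validation regex are simplified assumptions; the actual Hyperfish validation logic is not documented here.

```typescript
// Sketch: only directory objects with a valid mail property are
// scanned, which omits most service accounts.
interface DirectoryObject { name: string; mail?: string; }

const MAIL_PATTERN = /^[^@\s]+@[^@\s]+\.[^@\s]+$/; // simplified check

function scannableObjects(objects: DirectoryObject[]): DirectoryObject[] {
  return objects.filter((o) => o.mail !== undefined && MAIL_PATTERN.test(o.mail));
}
```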
Since the Hyperfish Agent passes analysis results to the Hyperfish Cloud Service, everything the cloud service stores (other than Office 365 properties) is processed through the Hyperfish Agent:
- Object GUID
- User email address
Additionally, updated user properties that are sent down from the Hyperfish Cloud Service to commit to AD are passed through the Hyperfish Agent.
Installation and operation of the Hyperfish Agent requires a constant internet connection. The Hyperfish on-premises agent utilizes the following outbound ports:
- 443 (HTTPS) - for API calls to authenticate the installation, check licenses, and download configuration from the Hyperfish cloud service:
  - A1.hyperfish.com 220.127.116.11
  - A2.hyperfish.com 18.104.22.168
  - A3.hyperfish.com 22.214.171.124
  - A4.hyperfish.com 126.96.36.199
- 5671 (AMQP/S over TLS) - for the Hyperfish queue service:
  - Q01.hyperfish.com 188.8.131.52
  - Q02.hyperfish.com 184.108.40.206
To verify communication with the service, a heartbeat ping is sent every five minutes from the Hyperfish Agent to the Hyperfish Cloud Service over HTTPS.
When configuration changes and profile updates are made through the Hyperfish cloud service, the change data, signed using a private certificate, is passed to a hosted message broker queue. The message, secured by Transport Layer Security (TLS), is passed to the Hyperfish Agent where the signature is verified. Finally, the agent updates its settings or commits changes to Active Directory.
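The sign-then-verify flow above can be sketched with standard asymmetric signing. The RSA key size and SHA-256 digest are assumptions; Hyperfish's actual certificate and algorithm choices are not specified here.

```typescript
// Sketch: the cloud service signs change data with a private key;
// the agent verifies the signature before committing the change.
import { generateKeyPairSync, sign, verify } from "node:crypto";

// key pair standing in for the service's private certificate
const { privateKey, publicKey } = generateKeyPairSync("rsa", { modulusLength: 2048 });

function signChange(change: object): { payload: Buffer; signature: Buffer } {
  const payload = Buffer.from(JSON.stringify(change));
  return { payload, signature: sign("sha256", payload, privateKey) };
}

function agentAccepts(payload: Buffer, signature: Buffer): boolean {
  // a tampered payload or forged signature fails verification
  return verify("sha256", payload, publicKey, signature);
}
```

Transport security (TLS on the message broker connection) protects the message in transit; the signature additionally lets the agent confirm the message originated from the service.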
In Transit Encryption
- HTTPS (HTTP over TLS) - Hyperfish secures all API communications over HTTPS, a TCP/IP protocol used by web servers to transfer web content securely. The data transferred is encrypted so that it cannot be read by anyone other than the recipient. The Hyperfish API earns an ‘A+’ rating from Qualys SSL Labs’ SSL Server Test, which assesses and provides a score for an endpoint’s protocol support, key exchange, and cipher strength.
- AMQPS (AMQP TLS) - Hyperfish uses message queuing with AMQPS or AMQP with TLS, a protocol that provides privacy and data integrity between two communicating applications. TLS is a widely deployed security protocol, used for any application that requires data to be securely interchanged over a network.
- Database service instances use full-volume encryption using the Linux Unified Key Setup (LUKS) specification.
- Database backup file encryption is performed using AES-256 in CTR mode with HMAC-SHA256 key algorithms.
Well-implemented managed services add the benefit of dedicated effort on product reliability, availability, and, most importantly, security. Hyperfish therefore uses hosted services when practical. These services include:
- Message queuing - hosted by CloudAMQP in Microsoft Azure
- Database - hosted by Aiven in Microsoft Azure
Hyperfish also utilizes Raygun (Mindscape) for real-time reporting of on-premises and browser errors.
- When on-premises errors occur, the agent passes the time of the error, environment information (machine host name and amount of RAM), user ID (Hyperfish internal), context ID, and stack traces to Hyperfish over HTTPS.
- For browser errors, Raygun captures the time of the error, context ID, user ID, browser (e.g. Chrome, Firefox, Edge), and browser version.
Hyperfish is hosted software, developed by LiveTiles using an Agile methodology; as such, the product is updated on a weekly basis. Hyperfish executes automated as well as manual testing for these weekly software updates.
The feature roadmap is managed solely by LiveTiles, but is populated with new features and capabilities primarily from customer and partner requests based on their business needs.
Product functionality tests are conducted by Hyperfish development and product management teams for any product enhancements being implemented, as well as for each weekly update. Testing verifies functional requirements, use cases, and that performance goals have been met.
All software development pertaining to Hyperfish is performed securely on-premises at Hyperfish headquarters in Kirkland, Washington, United States. Only the Hyperfish development team has access to the production environment.
Dedicated security efforts are one of the many reasons to leverage a cloud platform. The Hyperfish cloud service is built on the Microsoft Azure platform and shares the security benefits of hosting in Azure. For more information about Azure security, please refer to the Microsoft Azure Security documentation: https://www.microsoft.com/en-us/trustcenter/Security/AzureSecurity
All Hyperfish systems and data are made fully redundant. Daily backups are performed, and point-in-time recovery is available.
Intelligence enables organizations to make data-driven decisions on how to improve user experience by collecting usage analytics and reporting them back in a format that is easy to understand and action.
Intelligence does not collect or store users' Personally Identifiable Information (PII). When a user accesses a page, LiveTiles Intelligence assigns that user a unique ID based on their SharePoint User ID.
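LiveTiles does not document how the unique ID is derived from the SharePoint User ID, so the salted one-way hash below is only an illustrative assumption of how such a pseudonymous, stable analytics ID could be produced without storing PII:

```python
import hashlib

TENANT_SALT = "example-tenant-salt"  # hypothetical per-tenant value

def pseudonymous_id(sharepoint_user_id: int) -> str:
    """Derive a stable, non-reversible analytics ID from a SharePoint User ID.

    The same user always maps to the same ID (so usage can be aggregated),
    but the hash cannot be reversed to recover the original identifier.
    """
    raw = f"{TENANT_SALT}:{sharepoint_user_id}".encode()
    return hashlib.sha256(raw).hexdigest()
```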
Interactions performed by a user are streamed to Azure Blob Storage hosted by LiveTiles; customers can choose which Azure data center hosts their data. From the blob, an Azure Function aggregates the data into the LiveTiles SQL database, from which the UI reads the information for display.
The following events are collected and measured in LiveTiles Design Intelligence:
- Web part loads
- Page URL
- Web part/tile clicks
- Page loads
- Browser used to access the page
- Device used to access the page
- Users' department
- Page engagement score: a formula based on page loads, number of page events per session, number of sessions, and number of unique user visits
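The engagement score formula itself is not published; the sketch below is a purely hypothetical weighted combination of the four inputs named above, included only to make the shape of such a score concrete (the weights and normalization are invented for illustration):

```python
def engagement_score(page_loads, events, sessions, unique_visitors,
                     weights=(0.25, 0.25, 0.25, 0.25)):
    """Hypothetical engagement score: a weighted sum of page loads,
    events per session, session count, and unique visitors.
    The real LiveTiles formula is not documented."""
    if sessions == 0:
        return 0.0
    events_per_session = events / sessions
    w1, w2, w3, w4 = weights
    return (w1 * page_loads + w2 * events_per_session
            + w3 * sessions + w4 * unique_visitors)
```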
LiveTiles Bots is an enterprise grade web based chatbot platform built on Microsoft Bot Framework. From the web-based interface, users can create, manage, monitor and train bots using various out-of-the-box (OOTB) capabilities and integrations - all without requiring deep expertise in AI development concepts. Power users and developers can create complex Bot abilities and integrations that can be published to meet user needs across the organization.
LiveTiles Bots is a turnkey SaaS platform. This allows organizations to immediately start using LiveTiles Bots, without first having to provision or install any additional services in their existing environments. The platform runs on Microsoft Azure.
Microsoft Azure Subscription
To utilize the LiveTiles Bots platform your organization will need to have an active subscription to Microsoft Azure.
Azure Active Directory (AAD) is required for authentication (additional identity providers are on the roadmap). Bots can be deployed anonymously, without user authentication or identification (e.g. on a public website), or in contexts where the Bot must determine the identity of the user (e.g. when integrating with an internal system).
Abilities - default and custom
The Bot platform comes with a series of default OOTB abilities that are typically dependent on your organization leveraging Microsoft Office 365 and having enabled the Microsoft Graph API. For more information on what Graph API permissions are required for the out-of-the-box abilities, see this article.
The Bot platform also comes with tooling that will enable you to develop your own custom abilities to fit the needs of your organization. These are discussed in the Customization section of this article.
Register Bot Service
A user with Tenant Administrator permissions (or anyone with permission to register an application in the customer Azure tenant) is required to authenticate initially as part of the deployment process. This allows the LiveTiles Bots services to be registered in your Azure tenant, which in turn allows users from that tenant to use these applications.
The Bots platform ships with a set of built-in abilities. If a customer develops their own abilities, these reside in the customer's tenant. Abilities are delivered in the form of an API.
One of the OOTB abilities is the “Flow” ability. It can connect to either “Flow” or “Logic Apps”; these services must therefore be enabled in the customer tenant.
If the user opts in to conversation logging, a storage location must be specified where this data can be stored, and an endpoint must be provided to which the logged data can be streamed. This mitigates any confidentiality or security concerns around potentially sensitive data being stored in a LiveTiles tenant.
Deployment of QnA service (optional)
If the user is using the QnA ability, the QnA Azure service will need to be deployed into the customer tenant. Please refer to the LiveTiles Deployment Guide on how to deploy and enable this service.
The LiveTiles Bots SaaS version has publicly registered endpoints - only an internet connection is needed from the user and any integrating system.
Required URLs to be whitelisted:
Other URLs may be required depending on the bot use case that will be implemented and the services that it needs to be integrated with.
LiveTiles Bots are designed to be simple to administer for the typical business user. Below are some of the categories to get you started.
For a Getting Started guide on creating and managing Bots, see this article.
Depending on the use case, a Bot can have a single ability or deliver multiple abilities. For more information on Bot Abilities, visit: https://support.livetiles.nyc/hc/en-us/articles/360037856191-Bot-Assistant-Terminology-
Communicating with a bot
There are several existing forms of communication with a Bot. Whether it be via a message, button or card, communicating with your Bot is effortless. For more info, visit: https://support.livetiles.nyc/hc/en-us/articles/360037489612-Bot-Assistant-Communicating-with-a-bot-
Bots can be accessed through distinct channels, including Webchat, Teams or Skype. For more information on Bot supported channels, visit: https://support.livetiles.nyc/hc/en-us/articles/360037498592-Bot-Assistants-Supported-Channels-Services-Apps
For our full suite of Bots Support Articles visit this link: https://support.livetiles.nyc/hc/en-us/sections/360006686931-Bot-Assistants
Updates to the SaaS service are rolled out every few weeks with new features and functionality. The goal is to never roll out any changes that could affect or break production Bots in use by users. In order to achieve this, we have a very structured approach to rolling out updates:
- The Development team completes internal testing and publishes an updated release to our internal test environment.
- The Quality Assurance team performs testing to confirm there are no performance issues with the new/updated functionality, and also test new/updated feature interaction with existing functionality.
- Once the QA team gives the OK, the Development team pushes the release to our Pre-Production environment. This environment is used by other internal LiveTiles teams for demonstrations as well as internal LiveTiles bots.
- After the release is live on the pre-production environment with desired and expected performance and behavior, it is then pushed out to our production environment.
Power users can extend the platform using the “Flow” ability. The Flow ability allows a designer to connect their Bots to their Flows. The Flow ability fetches information from the user and passes it off to Flow. For more information on how to create a Flow ability, visit: http://help.livetiles.nyc/livetiles-bots/bot-abilities/flow
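Handing the collected values off to a Flow typically amounts to posting JSON to the Flow's HTTP trigger URL. The sketch below builds such a request; the trigger URL and field names are hypothetical illustrations, not the actual LiveTiles contract:

```python
import json
import urllib.request

def build_flow_request(trigger_url: str, answers: dict) -> urllib.request.Request:
    """Package values a bot collected from the user as a JSON POST to a
    Flow/Logic Apps HTTP trigger. The payload shape is hypothetical."""
    body = json.dumps(answers).encode()
    return urllib.request.Request(
        trigger_url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
```

Building the request separately from sending it keeps the hand-off testable without a live Flow endpoint; in practice the request would be dispatched with `urllib.request.urlopen` or an equivalent HTTP client.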
LiveTiles Bots allow integration to any O365 service that is available through the Microsoft Graph API. This includes email, calendar, address book, OneDrive, etc.
The LiveTiles “Flow” ability enables Bot power users to integrate other systems via Microsoft Flow. Below is a diagram of the integration pattern using this ability.
Additional integration options are available through custom abilities that can be provided specific to the system that is required. These custom abilities will allow your Bot developers to design the ability interface that users will see in the LiveTiles Bot designer, and also the custom code that is run once the ability is used. Standard API development patterns are used and abilities are written in .NET.
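Custom abilities are written in .NET; purely for brevity, the sketch below uses Python to illustrate the general request/response shape such an ability endpoint might implement. Every field name here is a hypothetical illustration, not the actual LiveTiles ability contract:

```python
def handle_ability_request(request: dict) -> dict:
    """Minimal shape of a custom ability endpoint: receive the user's
    utterance plus any collected parameters, return the bot's reply.
    Field names ('parameters', 'reply', 'endConversation') are assumed."""
    name = request.get("parameters", {}).get("name", "there")
    return {
        "reply": f"Hello, {name}!",
        "endConversation": True,
    }
```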
The diagram below shows the logical relationship between LiveTiles, the customer's tenants, and the Microsoft cognitive services.
Azure services being used:
In-transit encryption is provided via SSL certificates, which are owned by Microsoft as part of the App Service platform.
All messages are encrypted from the user to the Directline API. This is the entry point into the LiveTiles Bots environment. From here, messages are encrypted when passed between the different Microsoft services. How some of these services handle the messages is beyond LiveTiles' control.
TLS 1.1 or later is used for in-transit encryption of HTTP-based communication.
Configuration encryption at rest
Sensitive fields are encrypted at rest in the databases. The life cycle of these keys is also controlled by Microsoft.
A Key Vault is used to store sensitive passwords and secrets. The life cycle of the keys used to encrypt these is controlled by Microsoft.
An administrative overview providing organization-level insight and documentation is currently in development. It will give specified users visibility of all Bots across their organization, enabling central governance and control of the Bots being built, how they operate, and which channels they can be published to.
Backup and restore
LiveTiles uses enterprise backup and restore policies with data replicated for geo-redundancy.
LiveTiles does not store any customer data.
LiveTiles uses Microsoft PaaS infrastructure designed for high availability (HA) with geo-redundancy.
Monitoring is done with Azure Application Insights.