AI-powered platform for turning alternative data into investment insights. Includes data ingestion, cleaning, and feature engineering capabilities. Offers predictive analytics, visualization tools, and no-code modeling that let fund managers extract signals from diverse data sources.
Specialized systems for acquiring, cleaning, normalizing, and analyzing non-traditional data sources such as satellite imagery, web-scraped content, sentiment feeds, and other alternative datasets.
Source Diversity Ability to acquire data from a wide range of non-traditional sources (e.g., satellite, social media, web scraping). |
Exabel claims to support ingestion from diverse and non-traditional data sources for alternative data processing. | |
Automated Data Ingestion Support for automated pipelines that regularly fetch and update alternative datasets. |
Describes automated ingestion pipelines in product documentation and marketing material. | |
Real-time Acquisition Capability to collect and import data with minimal latency. |
No information available | |
API Integrations Availability of pre-built connectors to popular alternative data providers and APIs. |
Mentions integrations with leading data providers and API access in solution briefs. | |
Flexible File Formats Support for a broad set of data formats (CSV, JSON, XML, images, video, etc.). |
Supports various data types and file formats as described in Exabel’s technical overview. | |
Historical Data Access Ability to access and extract historical alternative data records. |
Platform advertises access to historical alternative data for backtesting and research use cases. | |
Data Licensing Management In-built tools to track and manage data usage rights and compliance for purchased datasets. |
No information available | |
Geospatial Coverage Coverage for geospatial data collection across multiple global regions. |
No information available | |
Data Volume Limits Maximum volume of data that can be ingested in a defined period. |
No information available | |
User Access Controls on Data Sources Granular permissions to restrict which users can set up/acquire which type of sources. |
No information available | |
Automated Outlier Detection System automatically flags and corrects extreme or inconsistent values. |
Data cleaning and anomaly detection is advertised as a core capability. | |
Missing Value Imputation Ability to identify and fill in missing data using statistical algorithms. |
The AI-powered platform implies the use of imputation techniques for missing data, as is typical in alternative data workflows; see the cleaning sketch following this feature list. | |
Deduplication Elimination of duplicate or redundant data entries within or across sources. |
No information available | |
Error Logging and Reporting Detailed audit trails and error logs for each cleaning operation. |
Audit trails and error logs are referenced as part of compliance/monitoring pitch. | |
Custom Data Cleaning Rules Ability to define and apply user-specified data validation and cleaning logic. |
No information available | |
Scalability Capacity to handle cleaning tasks for large volumes of data. |
No information available | |
Automated Quality Checks Regular, scheduled quality control routines to ensure cleaned data conforms to standards. |
No information available | |
Version Control for Cleaned Data Tracking changes and access to previous versions of cleaned datasets. |
Versioning on cleaned or processed datasets is a stated feature for transparency and reproducibility. | |
Data Consistency Validation Checks to ensure data conforms to expected formats and relationships. |
No information available | |
Anomaly Alerts Automated notifications when significant data anomalies are detected. |
No information available | |
Unit Standardization Automated conversion of data into consistent units (e.g., metric, currency). |
Normalization and standardization are specifically called out for multi-source data; see the normalization sketch following this feature list. | |
Schema Mapping Tools GUI or code-based tools to map source fields to internal data schemas. |
No information available | |
Time Alignment Adjustment of data timestamps across sources to a common standard. |
No information available | |
Normalization Performance Number of records normalized per minute. |
No information available | |
Custom Data Transformations Support for user-defined scripts and rules for bespoke normalization. |
No information available | |
Ontology Management Tools to maintain and apply taxonomy/ontology for alternative datasets. |
No information available | |
Data Linking Across Sources Ability to join and merge related records from different alternative datasets. |
No information available | |
Metadata Management Tools for managing standardized metadata about normalized data. |
Advertises extensive metadata and data catalog tools for alternative datasets. | |
Batch Processing Support Capacity to normalize large batches of alternative data files. |
Batch processing and no-code workflow tools are highlighted for large data jobs. | |
Cross-Source Consistency Checking Automated validation that normalized fields match across data providers. |
No information available | |
Entity Resolution Automatically match and merge records referencing the same real-world entity. |
No information available | |
Geospatial Tagging Append latitude/longitude and geotags to alternative datasets for use in spatial analysis. |
No information available | |
Sentiment Analysis Integration Auto-generation of sentiment scores from textual/voice/image data. |
NLP/sentiment extraction from news, social media, and other text-form alternative data is documented; see the illustrative scoring sketch following this feature list. | |
Event Tagging Automatic detection and labeling of significant economic, social, or physical events within the data. |
No information available | |
Derived Feature Generation Support for constructing custom indicators based on primary alternative data. |
No information available | |
Industry Classification Mapping Ability to map entities to industry standards (e.g., GICS, NAICS). |
No information available | |
Third-Party Data Joins Capability to enrich alternative data by merging with third-party or proprietary datasets. |
No information available | |
Machine Learning-based Scoring Automated scoring of entities using trained machine learning models. |
AI/ML-based scoring and predictive analytics is a key differentiator of the platform. | |
Audit Trails for Enrichment Processes Detailed logs of all enrichment actions taken. |
No information available | |
External Reference Data Access API or links to regulatory, financial, or public reference datasets. |
Mentions links to financial, regulatory and reference data, supporting enrichment with external datasets. |
Exploratory Data Analysis Tools Built-in visual and statistical analysis tools for alternative datasets. |
No information available | |
Predictive Model Integration Ability to build, deploy, and run predictive models on alternative data. |
Supports predictive modeling directly on alternative datasets with built-in algorithms. | |
Customizable Dashboards Interactive dashboards for visualizing key metrics from alternative data. |
Has customizable and interactive dashboards as part of its visualization suite. | |
Event Detection Algorithms Automated identification of significant new events within alternative data feeds. |
No information available | |
Multivariate Analysis Support Ability to analyze complex interdependencies between variables. |
No information available | |
Natural Language Processing Capabilities Built-in NLP tools for analyzing text-heavy alternative datasets. |
No information available | |
Visualization Export Options Export visualizations and charts in various formats (PDF, PNG, etc.). |
Dashboards and charts can be exported into multiple formats, advertised in documentation. | |
Backtesting Frameworks Ability to test investment strategies on historical alternative data. |
No information available | |
Statistical Alert Triggers Set alerts when indicators from alternative data cross statistical thresholds. |
Users can set alert triggers for statistical thresholds in the analytics environment; see the threshold-alert sketch following this feature list. | |
Signal Latency Average time between data update and signal generation. |
No information available | |
Maximum Supported Data Volume The largest single dataset size supported for import and processing. |
No information available | |
Parallel Processing Capability Ability to process multiple data streams or files concurrently. |
Parallel processing for data ingestion/cleanup and analysis is part of the architecture. | |
Elastic Compute Integration Integration with cloud resources to scale up/down compute usage. |
Cloud-based, elastic infrastructure is referenced in technical details. | |
Load Balancing Automatic distribution of processing workloads for optimal utilization. |
No information available | |
Processing Throughput Speed of data throughput during processing operations. |
No information available | |
Scalable Storage Support Expandable data storage to accommodate increasing data volumes. |
Platform is described as scaling elastically for both compute and storage, leveraging cloud storage. | |
Data Archival Automated, cost-effective archiving of old alternative datasets. |
No information available | |
Batch and Real-Time Processing Modes Support for both scheduled/batch and continuous real-time data pipelines. |
Explicitly supports both batch and real-time data pipeline operation. | |
Disaster Recovery/Rollback Systems for rapid restore from backup or rollback points. |
No information available | |
Processing Error Handling Automated management of processing failures and retries. |
No information available | |
Role-Based Access Control Fine-grained permission management for users and groups. |
Enterprise platform with user/group permissions based on roles described in technical docs. | |
Data Encryption in Transit Encryption for alternative data while in transit between systems. |
Mentions data encryption in transit (TLS/HTTPS) as part of security features. | |
Data Encryption at Rest Encryption of alternative data stored in all databases and filesystems. |
Cloud platform ensuring encrypted storage (at rest) for all client data. | |
GDPR Compliance Tools Support for compliance with EU GDPR and related privacy regulations. |
No information available | |
Audit Logging Immutable logs of user activity and data changes. |
Audit logging of user activity and data changes is highlighted as a standard financial-services capability. | |
Integrated Consent Management Tools to track legal consents for data use across sources. |
No information available | |
Data Masking/Tokenization Obfuscation of sensitive fields to protect personal information. |
No information available | |
Vendor Due Diligence Framework to vet and approve external data providers for compliance. |
No information available | |
User Authentication Protocols Supports modern authentication standards (e.g., SSO, MFA). |
No information available | |
Automated Regulatory Reporting Automated generation of reports required by financial regulators. |
No information available | |
Standardized Data Export Ability to export alternative data in standard formats (e.g., FIX, CSV, Parquet, JSON). |
Data export in multiple standard formats (CSV, JSON, etc.) is documented for downstream analysis. | |
Pre-built Connectors to OMS/PMS Out-of-the-box integration with order and portfolio management systems. |
No information available | |
Custom API Support Provision of a customizable API for bespoke use cases. |
Custom API workflows are supported and referenced in integration literature. | |
Webhooks and Event Streaming Push updates and events to downstream systems via webhooks. |
No information available | |
BI Tool Integration Built-in adapters for business intelligence/data visualization platforms. |
No information available | |
Cloud Storage Integrations Support for uploading or syncing data with major cloud providers (AWS, GCP, Azure). |
Cloud storage integrations (AWS, GCP, Azure) are foundational to Exabel’s SaaS data hosting. | |
Python/R SDKs Official software libraries for interacting with the system programmatically. |
Official Python API/SDK exists for user programmability and integration; see the hedged access sketch following this feature list. | |
Batch Data Download Scheduling Automate the extraction of new data in regular intervals. |
No information available | |
Custom Field Mapping Easily map alternative data fields to the internal structures of downstream systems. |
Platform allows mapping fields from alternative datasets to internal schemas via UI and automation. | |
Data Lineage Visualization Visual trace of data flow and transformations for downstream users. |
Provides data lineage and workflow visualizations as called out in solution overviews. |
Self-Service Data Discovery Non-technical users can search and preview available alternative datasets. |
Non-technical data discovery is highlighted through self-service data catalog features. | |
Point-and-Click Data Pipeline Design Visual editors for creating data processing and transformation workflows. |
Offers graphical, no-code/low-code pipeline builders for workflow creation. | |
Customizable User Dashboards Users can assemble dashboards tailored to their needs. |
User-customizable dashboards are present as per marketing materials. | |
In-Platform Documentation & Help Contextual help and API documentation available within the system. |
Comprehensive in-platform documentation and contextual guides cited in onboarding info. | |
Global Search Search across datasets, metadata, and processing logs. |
No information available | |
Process Monitoring UI Graphical overview of all ongoing and completed processes. |
Monitoring UI for process status and flows featured in demo screenshots. | |
Personalized Notifications Users receive alerts for errors or data arrivals relevant to them. |
No information available | |
API Documentation Quality Score A rating or score for the completeness and usability of the provided API docs. |
No information available | |
Language Localization Support for multiple languages in the UI. |
No information available | |
Accessibility Compliance Follows accessibility standards for inclusive UI design. |
No information available | |
Data Pipeline Health Monitoring Real-time status views and alerts for all active data flows. |
No information available | |
Automated Failure Recovery Automatic restart or rerouting in case of pipeline errors. |
No information available | |
System Uptime SLA Percentage of time the system is contractually guaranteed to be available. |
No information available | |
Job Scheduling and Queuing Manage concurrent tasks and prioritize urgent processes. |
No information available | |
Real-time Error Notifications Immediate alerts to relevant teams upon failures. |
No information available | |
API Latency Monitoring Tracks response times of API endpoints. |
No information available | |
Resource Usage Analytics Metrics and trends on compute, memory, and storage usage. |
No information available | |
Capacity Planning Tools Forecast future system demands using historical trends. |
No information available | |
Manual Job Restart/Intervention Allow operators to manually intervene in processing jobs. |
No information available | |
Operational Audit Logs Detailed records of all operational activities and interventions. |
No information available | |
24/7 Technical Support Technical helpdesk is available around the clock. |
No information available | |
Dedicated Account Management Assigned representative familiar with your implementation and needs. |
No information available | |
Implementation Services Availability of vendor-led onboarding and integration projects. |
No information available | |
Custom Feature Development Vendor is willing to build bespoke features upon request. |
No information available | |
Knowledge Base and Training Materials Comprehensive documentation and self-paced training content. |
No information available | |
Onsite Training Vendor offers onsite workshops or training as part of onboarding. |
No information available | |
Service Level Agreement (SLA) Terms Contractually specified guarantees on support response and issue resolution times. |
No information available | |
User Community and Forums Active user groups and forums for community support. |
No information available | |
Regular Product Updates Scheduled enhancement releases and security patching. |
No information available | |
Third-Party Certification Support Vendor compliance with recognized security, privacy, or quality standards. |
No information available | |
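The rows above on automated outlier detection, missing value imputation, and deduplication describe standard cleaning steps. Below is a minimal pandas sketch of what those steps typically look like, not Exabel's implementation; the column names, IQR fence, and median imputation are illustrative assumptions.

```python
import numpy as np
import pandas as pd

def clean_panel(df: pd.DataFrame, value_col: str = "value") -> pd.DataFrame:
    # Deduplication: keep one row per entity/timestamp pair
    df = df.drop_duplicates(subset=["entity_id", "timestamp"]).copy()
    # Outlier flagging with a simple IQR fence (robust to a single spike)
    q1, q3 = df[value_col].quantile([0.25, 0.75])
    fence = 1.5 * (q3 - q1)
    df["is_outlier"] = (df[value_col] < q1 - fence) | (df[value_col] > q3 + fence)
    # Treat flagged values as missing, then impute gaps with the median
    df.loc[df["is_outlier"], value_col] = np.nan
    df[value_col] = df[value_col].fillna(df[value_col].median())
    return df

raw = pd.DataFrame({
    "entity_id": ["ACME"] * 7,
    "timestamp": pd.to_datetime(["2024-01-01", "2024-01-02", "2024-01-02",
                                 "2024-01-03", "2024-01-04", "2024-01-05", "2024-01-06"]),
    "value": [10.0, 11.0, 11.0, 12.0, None, 13.0, 500.0],  # one duplicate, one gap, one spike
})
print(clean_panel(raw))
```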
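For the unit standardization, schema mapping, and time alignment rows, a minimal sketch of a typical normalization pass; the field mapping and FX rates are illustrative placeholders, not values taken from the platform.

```python
import pandas as pd

# Hypothetical mapping from a provider's field names to an internal schema
FIELD_MAP = {"ticker": "entity_id", "obs_time": "timestamp", "txn_amount_cents": "value"}
# Assumed static FX table for the example; a real pipeline would source rates externally
FX_TO_USD = {"USD": 1.00, "EUR": 1.08, "NOK": 0.093}

def normalize(df: pd.DataFrame) -> pd.DataFrame:
    df = df.rename(columns=FIELD_MAP)                              # schema mapping
    df["timestamp"] = pd.to_datetime(df["timestamp"], utc=True)    # align timestamps to UTC
    df["value"] = df["value"] / 100.0                              # cents -> whole currency units
    df["value_usd"] = df["value"] * df["currency"].map(FX_TO_USD)  # currency standardization
    return df.sort_values("timestamp")

provider_feed = pd.DataFrame({
    "ticker": ["ACME", "ACME"],
    "obs_time": ["2024-03-01T09:30:00+01:00", "2024-03-01T15:45:00-05:00"],
    "txn_amount_cents": [125000, 98000],
    "currency": ["EUR", "USD"],
})
print(normalize(provider_feed)[["entity_id", "timestamp", "value_usd"]])
```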
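For the sentiment analysis integration row, a stand-in sketch using NLTK's VADER analyzer; Exabel's actual NLP stack is not documented here and may differ entirely.

```python
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download
analyzer = SentimentIntensityAnalyzer()

headlines = [
    "ACME beats earnings expectations and raises full-year guidance",
    "Regulator opens probe into ACME's accounting practices",
]
for text in headlines:
    # compound score runs from -1 (most negative) to +1 (most positive)
    score = analyzer.polarity_scores(text)["compound"]
    print(f"{score:+.2f}  {text}")
```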
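For the statistical alert triggers row, a minimal sketch of a rolling z-score trigger; the window length and threshold are arbitrary assumptions, not platform defaults.

```python
import pandas as pd

def threshold_alerts(series: pd.Series, window: int = 30, z: float = 3.0) -> pd.Series:
    # Rolling mean and standard deviation over a trailing window
    mu = series.rolling(window, min_periods=10).mean()
    sigma = series.rolling(window, min_periods=10).std()
    zscores = (series - mu) / sigma
    return zscores.abs() > z  # True wherever an alert would fire

# Synthetic daily foot-traffic index with one injected spike
idx = pd.date_range("2024-01-01", periods=60, freq="D")
traffic = pd.Series(100.0, index=idx)
traffic.iloc[45] = 160.0

alerts = threshold_alerts(traffic)
print(alerts[alerts])  # dates on which the alert fires
```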
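For the Python/R SDKs and custom API rows, a hedged sketch of programmatic access over a generic REST interface. The host, endpoint path, parameter names, and response shape are placeholders rather than Exabel's documented API; consult the vendor's SDK and API reference for the real interface.

```python
import os
import requests
import pandas as pd

BASE_URL = "https://api.example-altdata-platform.com/v1"   # placeholder host
API_KEY = os.environ.get("ALTDATA_API_KEY", "demo-key")    # assumed bearer-token auth

def fetch_signal(signal_id: str, start: str, end: str) -> pd.DataFrame:
    resp = requests.get(
        f"{BASE_URL}/signals/{signal_id}/values",           # hypothetical endpoint
        params={"start": start, "end": end},
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=30,
    )
    resp.raise_for_status()
    return pd.DataFrame(resp.json()["values"])               # assumed response field

# Example call (requires a real endpoint and credentials):
# df = fetch_signal("web_traffic_acme", "2024-01-01", "2024-06-30")
```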
Tools that collect, process, and analyze non-traditional data sources such as satellite imagery, social media sentiment, credit card transactions, and mobile location data to generate investment insights not available from conventional sources.
Number of Data Sources Total distinct alternative data sources (e.g., satellites, social, POS) the platform integrates. |
No information available | |
Source Diversity Range of data types covered (e.g., geospatial, transactional, web-scraped, sensor data, etc.) |
No information available | |
Data Source Transparency Level of disclosure around data origins and collection methods. |
The Exabel platform includes data provenance and audit tools per documentation, supporting transparency on data collection. | |
Coverage Geography Geographical breadth of alternative data (e.g., global, regional, local markets). |
No information available | |
Historical Depth Amount of historical data available for backtesting and longitudinal analysis. |
No information available | |
Source Update Frequency How often new data is ingested from sources. |
No information available | |
Exclusive or Unique Sources Whether the platform provides access to otherwise unavailable/uncommon datasets. |
Not as far as we are aware; Exabel primarily connects users to existing alternative data sources rather than offering proprietary or exclusive datasets. | |
Source Verification Processes in place to verify data authenticity and quality. |
Documentation references data onboarding checks and systematic quality verification processes. | |
Consent & Compliance Ensures data sources are ethically and legally obtained with proper user consent. |
Platform claims all data sources are obtained in compliance with relevant regulations and with proper user consent, as required of enterprise investment platforms. | |
Real-Time Data Availability Whether some or all data sources provide real-time or near-real-time feeds. |
Exabel advertises real-time or near-real-time feeds for select alternative data sources. | |
Unstructured Data Handling Ability to process and integrate unstructured data such as images or text. |
The platform supports unstructured data (text, images) per features page. | |
Data Licensing Terms Clarity Transparency and clarity of the licensing rights and restrictions regarding data use. |
Data usage and licensing terms are described as transparent in public documentation. |
Error Rate Frequency of data processing or reporting errors. |
No information available | |
Missing Data Handling Systematic mitigation of gaps or missing values in data streams. |
Platform includes tools for dealing with missing or incomplete data. | |
Data Normalization Standardization of datasets for ease of analysis. |
Automated data cleaning and normalization processes are core to the offering. | |
Data Granularity Level of detail available (e.g., hourly, daily, per store, per SKU). |
No information available | |
Quality Assurance Processes Robustness of quality control and regular audits. |
References to ongoing QC and data health monitoring throughout platform literature. | |
Latency Time delay between data creation and its availability on the platform. |
No information available | |
Deduplication Automated detection and removal of duplicate entries. |
Deduplication is listed as part of data cleaning pipeline. | |
Anomaly Detection System flags and explains outliers or errors in incoming data. |
System includes automated anomaly and error detection components. | |
Imputation Techniques Advanced strategies for predicting and filling missing data. |
The platform includes statistical tools for imputing missing values. | |
Version Control Ability to track changes and updates to datasets for auditability. |
Detailed audit trails and version control for custom inputs are mentioned in documentation. |
Prebuilt Analytics Number of out-of-the-box analytical models or dashboards for typical investment use cases. |
No information available | |
Custom Analysis Capability Ability to build custom models or queries on platform data. |
Custom model and signal building (including no-code options) are core to the product. | |
Correlation Analysis Supports finding relationships between alternative data and traditional financial metrics. |
Platform has tools to detect correlations between alt data signals and capital market metrics. | |
Backtesting Tools Built-in functionality for testing investment hypotheses using historical data. |
Backtesting modules for historical performance analysis are specifically marketed; see the toy signal-evaluation sketch at the end of this section. | |
Predictive Modeling Availability of machine learning or AI-driven forecasting modules. |
Predictive analytics advertised as one of the platform's flagship features and includes AI/ML modules. | |
Sentiment Analysis Detects and quantifies market sentiment from textual or social media data. |
Sentiment analysis from news and social sources is described in feature set. | |
Geospatial Analytics Ability to map and analyze spatial data (e.g., satellite imagery). |
Geospatial data (e.g., satellite) is part of the supported data set. | |
Real-Time Alerting Automatic notification of significant changes or anomalies relevant to portfolio holdings. |
The system offers real-time alerting and notification options. | |
Enrichment with Traditional Data Integrated blending of alternative and conventional financial data sets. |
Describes enrichment of alternative (e.g., satellite, web) with traditional (financial, pricing) data. | |
Explainability of Models Features supporting interpretability of predictive signals and models. |
AutoML and explainable AI are commonly cited differentiators in platform documentation. | |
Scalability of Analytics Ability for analytic tools to function with large and growing data sets. |
References to big data architecture and analytics scalability for large data volumes. |
API Availability Provision of programmatic data and analytics access via APIs. |
Data and analytics are accessible via documented API for integration. | |
Standard Data Connectors Prebuilt connectors to common analytics, BI, or portfolio management tools. |
No information available | |
Bulk Data Export Support for exporting large batches of raw or processed data. |
Bulk and batch data export is supported for downstream use; see the export sketch at the end of this section. | |
Real-time Streaming Integration Ability for real-time data feeds to integrate into live workflows. |
Real-time streaming hooks described in product API documentation. | |
Cloud Storage Integration Compatibility with popular cloud storage solutions (AWS, Azure, GCP, etc.). |
No information available | |
Role-Based Access Control Enables fine-grained access management for different organizational users. |
Role-based access controls are described as supported for enterprise clients. | |
User Interface Usability Intuitive and efficient user interface for analysts and developers. |
Platform emphasizes usability for both technical and non-technical users (no-code). | |
Mobile Access Functional accessibility from mobile devices (native apps or web). |
No information available | |
White-Labeling Possibility to customize the platform to align visually and functionally with client brand. |
No information available | |
SDK Availability Provision of software development kits for easier custom integration. |
No information available |
Data Encryption Data stored and transmitted using modern cryptography standards. |
Industry-standard encryption in transit and at rest referenced in trust center docs. | |
Audit Logging Maintains a complete log of access and actions for compliance. |
Audit logging used for all user and API activity (security statement). | |
Regulatory Certifications Possession of certifications such as GDPR, CCPA, SOC2 relevant to data compliance. |
Platform claims compliance with GDPR/CCPA in privacy and compliance documentation. | |
User Access Controls Granular user and group-based permissioning on data and analytics. |
Granular user/group permissions are a baseline security feature. | |
Penetration Testing Regular vulnerability/pentesting assessments are performed. |
No information available | |
Privacy Protection Guarantees regarding data subject anonymity and privacy protection. |
Privacy by design is claimed on the website, along with conformance to major privacy frameworks. | |
Data Residency Options Ability to specify jurisdictions where data is stored or processed. |
No information available | |
Third-party Data Sharing Controls Restrictions/controls over redistribution of consumed data. |
No information available | |
Security Incident Notification Automated alerting regarding breaches or suspicious activity. |
Security incident & breach notifications are part of security commitment (see website). | |
Vendor Diligence Support Facilitates client due diligence workflows (DDQ, supporting docs, etc). |
No information available |
Concurrent User Capacity Maximum number of active users supported simultaneously. |
No information available | |
Query Response Time Median time taken to return analytics or data queries. |
No information available | |
Data Ingestion Rate Volume of new data processed per unit of time. |
No information available | |
Uptime SLA Service Level Agreement (SLA) on platform operational uptime. |
No information available | |
Peak Data Storage Capacity Maximum volume of data the platform can host. |
No information available | |
Elastic Compute Scaling Automatic scaling of compute resources to match workload. |
Elastic scaling for analytics modules is cited in technical overviews. | |
Parallel Processing Support Ability to process multiple data streams or queries concurrently. |
Parallelized data processing is referenced in the context of large-scale analysis. | |
Data Retention Policy Flexibility Configurable data storage retention periods. |
Configurable retention periods referenced for data compliance. | |
Performance Monitoring Tools Built-in dashboards or reporting for platform health and performance. |
Built-in dashboards/journey monitoring described in platform documentation. | |
Batch Processing Support Efficient support for large batch data operations. |
Batch processing/mass ingestion mentioned as supported for large datasets. |
Onboarding Support Personalized setup, initial training, and account configuration help. |
Onboarding and account support is referenced as available to all clients. | |
Knowledge Base Extensive searchable documentation and tutorials. |
Comprehensive online help/resources and documentation available. | |
Data Dictionary Comprehensive descriptions of each data field and its origin. |
Detailed data dictionary (field-level descriptions) available in documentation. | |
API Documentation Completeness Detail and clarity of technical integration guidelines. |
API documentation is described as robust and detailed for easy integration. | |
Dedicated Account Management Assigned customer success managers for enterprise users. |
Enterprise customers are assigned a dedicated account manager. | |
Live Chat Support Immediate support via chat with platform staff. |
No information available | |
User Training Workshops Regularly scheduled or on-demand platform training. |
Live or scheduled user training is referenced in support materials. | |
Community Forum User-to-user interaction and support hub. |
No information available | |
Localization Support Availability in multiple languages and time zones. |
No information available | |
Feedback Mechanism Ability for users to suggest features or report issues and track resolution. |
Feature request and feedback portal described as part of customer engagement. |
Usage-Based Pricing Option for pricing tied to volume or levels of consumption. |
No information available | |
Tiered Subscription Models Multiple service tiers for diverse needs and budgets. |
Platform has multiple service levels for varied client sizes and budgets. | |
Custom Enterprise Agreements Ability to negotiate bespoke terms for large clients. |
Custom enterprise pricing/terms are offered for larger or strategic clients. | |
Transparent Fee Structures Upfront and clear disclosure of fees across all services. |
Pricing details and explanations are clearly available on the site/documentation. | |
Free Trial Availability Short-term trial or demo use before commitment. |
Free trial/demo access is referenced as available for new clients. | |
Minimum Contract Term Shortest term for standard agreements. |
No information available | |
Add-On Modules Pricing Clarity and flexibility of pricing for optional advanced modules. |
No information available | |
Early Termination Options Availability of low-penalty or pro-rata contract cancellation. |
No information available | |
Unlimited User Pricing Flat-rate pricing not tied to user count. |
No information available | |
Nonprofit/Educational Discounts Special pricing for academic or nonprofit organizations. |
No information available |
Custom Dashboard Building Ability to design and save custom dashboards for specific analyses. |
Custom dashboarding is available in the no-code modeling feature set. | |
Workflow Automation Integration with tools for process automation (e.g., alerts, trade signals). |
Workflow automation (alerts, pipeline triggers) are provided as part of platform. | |
Custom Data Ingestion Upload and merge a client’s own alternative or proprietary data. |
Users can upload their own proprietary/alternative data. | |
Plugin/Extension Framework Ability to add modules/extensions for new functionalities. |
No information available | |
Scripting/Programming Interface Support for custom script development (e.g., Python, R APIs). |
Platform supports scripting via open API and integrations (Python SDK referenced). | |
Theming/Branding Customization Visual customization for brand consistency. |
Theming and branding customization available for enterprise, per sales documentation. | |
Custom Reporting Design and automate custom report formats. |
Custom reporting (outputs, formatting) referenced in both analytics and workflow docs. | |
Alert Customization User-defined conditions and triggers for event-driven alerts. |
Advanced, user-defined alert conditions available in alerting/monitoring module. | |
Integration with In-House Tools Ability for organization-specific connectors/modules. |
No information available | |
Deploy in Private Cloud Support for on-premises or VPC deployment for sensitive clients. |
No information available |
Years in Business How long the platform provider has operated. |
No information available | |
Referenceable Clients Number of notable clients willing to provide references. |
No information available | |
Third-Party Reviews Number and quality of external analyst or customer reviews. |
No information available | |
Legal Disputes Disclosed History of significant legal action or unresolved disputes. |
No information available | |
Financial Transparency Annual financial reporting or third-party audits available. |
No information available | |
Industry Partnerships Participation in alliances or consortia that enhance credibility. |
No information available | |
Churn Rate Percentage of clients discontinuing service per year. |
No information available | |
Client Growth Rate Annual increase in platform users or logos. |
No information available | |
Awards/Industry Recognition Recognition by reputable industry organizations. |
No information available | |
Business Continuity Plan Documented and tested plans for disaster recovery and service resilience. |
Business continuity/disaster recovery statements present in trust and support documentation. |
AI/ML Model Upgrades Frequency of innovation or upgrades in analytic and predictive engines. |
No information available | |
New Data Source Integration Rate How quickly new alternative data sources are made available to users. |
No information available | |
Participates in Data Consortiums Active member of data-sharing or standards organizations. |
No information available | |
Visualization Innovation Frequency of new or advanced visualization techniques introduced. |
No information available | |
Beta Testing/Client Feature Input Mechanisms for early adopter programs or client-driven roadmap. |
Active feedback and early adopter programs mentioned as part of customer engagement. | |
Academic Collaborations Partnerships with universities for applied research. |
No information available | |
Open Data Initiatives Support or contribution to open alternative data/tech communities. |
No information available | |
Data Science Sandbox Environment for clients to experiment with new data and analytics. |
An environment for custom analytics and data experimentation is recognized by platform reviewers. | |
API Versioning and Roadmap Disclosure Transparent release plans and versioning for APIs and tech. |
No information available | |
Cross-Asset Data Opportunities Support for new data categories relevant across markets (e.g., ESG, crypto). |
Platform offers ESG, crypto and other new alternative data category integrations. |
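For the correlation analysis and backtesting rows above, a toy sketch of evaluating an alternative data signal against forward returns. The series, the Spearman rank correlation, and the long/flat rule are synthetic illustrations under assumed inputs, not a recommended methodology or the platform's own backtesting engine.

```python
import pandas as pd

def evaluate_signal(alt_signal: pd.Series, fwd_returns: pd.Series) -> dict:
    joined = pd.concat({"signal": alt_signal, "fwd_ret": fwd_returns}, axis=1).dropna()
    # Rank correlation between the alternative data signal and forward returns
    rank_ic = joined["signal"].corr(joined["fwd_ret"], method="spearman")
    # Toy long/flat rule: hold the security only when the signal is above its median
    position = (joined["signal"] > joined["signal"].median()).astype(float)
    strategy_ret = position * joined["fwd_ret"]
    return {"rank_ic": rank_ic, "cumulative_return": (1 + strategy_ret).prod() - 1}

dates = pd.date_range("2023-01-06", periods=8, freq="W-FRI")
alt_signal = pd.Series([0.02, -0.01, 0.03, 0.00, 0.04, -0.02, 0.01, 0.05], index=dates)   # e.g. card-spend growth
fwd_returns = pd.Series([0.010, -0.004, 0.012, 0.001, 0.018, -0.009, 0.003, 0.020], index=dates)
print(evaluate_signal(alt_signal, fwd_returns))
```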
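For the bulk data export and standardized export format rows, a short sketch of writing a processed signal table to common formats; the file names and columns are illustrative.

```python
import pandas as pd

signals = pd.DataFrame({
    "entity_id": ["ACME", "GLOBEX"],
    "date": pd.to_datetime(["2024-06-28", "2024-06-28"]),
    "signal": [0.042, -0.013],
})

signals.to_csv("signals.csv", index=False)
signals.to_json("signals.json", orient="records", date_format="iso")
signals.to_parquet("signals.parquet")  # requires pyarrow or fastparquet
```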