Quest Data Modeler AI – Overview by Category
The following table provides a high‑level overview of how Quest Data Modeler AI is organized and what each category delivers. Detailed questions and answers follow in the FAQ section.
| Category | What QDM Delivers |
|---|---|
| AI Platform | AI capabilities powered by Azure OpenAI (GPT‑5.1), hosted on Microsoft Azure. |
| Modeling Capabilities | AI‑assisted generation and modification of logical and physical data models, including entities, attributes, and relationships. |
| Data Processing Scope | Processes only data required for data modeling workflows, including user prompts, limited model context, and execution metadata. |
| Data Storage | Persists conversation data, execution records, and model snapshots to support multi‑turn interactions and rollback. |
| Privacy Protection | Ensures user data is not used for AI training, advertising, or profiling, and supports user‑initiated deletion. |
| Security Controls | Protects AI services with authentication, authorization, encryption in transit, and infrastructure‑level encryption at rest. |
| Data Isolation | Enforces strict per‑user and per‑catalog access so customer data is not visible to other users or tenants. |
| Compliance Support | Aligns with GDPR, CCPA, and Microsoft’s Data Protection Addendum (DPA) through data minimization and isolation. |
| Governance and Auditability | Logs AI interactions and execution records to support audits, troubleshooting, and compliance reviews. |
| Operational Safeguards | Applies validation, approval workflows, rate limiting, and rollback mechanisms to reduce AI‑related risks. |
| Service Limitations | Enforces defined rate limits and scope restrictions; requires online connectivity and human review of outputs. |
Quest Data Modeler AI – Security, Privacy, and Compliance Frequently Asked Questions (FAQs)
What large language model does Quest Data Modeler use for AI features?
Quest Data Modeler uses Azure OpenAI (GPT‑5.1).
Where is the AI model hosted and deployed?
The AI model is hosted on Microsoft Azure, specifically through the Azure OpenAI Service.
What are the terms of use for AI capabilities in Quest Data Modeler?
The AI backend is built on the Microsoft Azure OpenAI Service. Its usage is governed by:
- Microsoft Azure OpenAI Service Terms
- Quest Software product terms for Quest Data Modeler
What types of user and system data are sent to Quest when using AI in Quest Data Modeler?
The following data may be processed:
- User’s natural-language message (the prompt sent to the LLM)
- Conversation history (multi-turn context)
- Data model context (enables reasoning about the current model)
- Execution plan context (used during planning and execution phases)
- File attachments for multimodal inputs, including text, CSV, Excel, and JSON files
- System prompt (role definitions and safety instructions)
- Session metadata for routing and correlation (not sent directly to the LLM)
For what purposes is user data collected and processed by Quest Data Modeler AI?
User data is collected and processed exclusively for:
- Data model generation
- Data model modification
- Intent classification
- Requirement validation
- Conversational assistance
- Execution summarization
- Conversation memory for multi-turn interactions
No data is collected for advertising, profiling, or unrelated purposes.
Is user input data stored when using AI features in Quest Data Modeler, and if so, for how long?
Yes. User input is stored in a PostgreSQL database as part of the multi-turn conversation architecture. Stored data includes:
- User messages
- Assistant responses
- Conversation history
- Agent execution records
- Data model snapshots
Retention Policy: Data persists until explicitly deleted by the user, such as by deleting a chat session.
What additional data is collected if users submit feedback through AI features in Quest Data Modeler?
No additional data is collected, as there is currently no dedicated feedback mechanism for AI responses.
How is authentication handled for AI features in Quest Data Modeler?
All AI service endpoints are secured using stateless JWT-based authentication. Users must log in via Quest erwin Mart Portal to access AI services.
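Stateless JWT authentication means each request carries a signed token that the service can verify without session storage. The following is a minimal illustrative sketch of HS256 signing and verification using only the standard library; the claim names and secret are hypothetical and do not reflect QDM's actual token layout.

```python
import base64
import hashlib
import hmac
import json
import time


def _b64url_decode(part: str) -> bytes:
    # JWT segments are base64url without padding; restore padding before decoding.
    return base64.urlsafe_b64decode(part + "=" * (-len(part) % 4))


def make_jwt_hs256(claims: dict, secret: bytes) -> str:
    """Build an HS256-signed JWT (demo helper for the sketch below)."""
    def enc(obj):
        raw = json.dumps(obj, separators=(",", ":")).encode()
        return base64.urlsafe_b64encode(raw).rstrip(b"=").decode()

    signing_input = enc({"alg": "HS256", "typ": "JWT"}) + "." + enc(claims)
    sig = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    return signing_input + "." + base64.urlsafe_b64encode(sig).rstrip(b"=").decode()


def verify_jwt_hs256(token: str, secret: bytes) -> dict:
    """Check the signature and expiry of an HS256 JWT; return its claims."""
    header_b64, payload_b64, sig_b64 = token.split(".")
    expected = hmac.new(
        secret, (header_b64 + "." + payload_b64).encode(), hashlib.sha256
    ).digest()
    if not hmac.compare_digest(expected, _b64url_decode(sig_b64)):
        raise ValueError("invalid signature")
    claims = json.loads(_b64url_decode(payload_b64))
    if claims.get("exp", 0) < time.time():
        raise ValueError("token expired")
    return claims
```

In practice a production service would use a maintained library (e.g. one implementing RFC 7519) rather than hand-rolled verification; the sketch only shows why no server-side session state is needed.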
How is data encrypted while in transit when using AI in Quest Data Modeler?
Data in transit is protected through:
- HTTPS communication with Azure OpenAI
- HTTP Strict Transport Security (HSTS)
- TLS encryption via the API Gateway
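HSTS works by having the server send a `Strict-Transport-Security` response header that instructs browsers to use HTTPS for all future requests. As a small illustration (the minimum age used here is an assumption, not a QDM setting), a header value can be checked like this:

```python
def hsts_enforced(header_value: str, min_age: int = 31536000) -> bool:
    """Return True if a Strict-Transport-Security header value enforces
    HTTPS for at least min_age seconds (default: one year)."""
    for directive in header_value.split(";"):
        directive = directive.strip().lower()
        if directive.startswith("max-age="):
            try:
                return int(directive.split("=", 1)[1]) >= min_age
            except ValueError:
                return False
    return False  # no max-age directive means HSTS is not enforced
```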
How is data protected and encrypted at rest for Quest Data Modeler AI services?
Encryption at rest is managed at the infrastructure level:
- PostgreSQL: Stores conversation data, chat messages, and model snapshots
- Redis: Stores rate-limiting state
Azure-managed services such as Azure Database for PostgreSQL and Azure Cache for Redis support encryption at rest by default.
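Rate-limiting state in Redis is commonly kept as a per-user counter that resets each time window (e.g. via `INCR` plus `EXPIRE`). The sketch below illustrates that fixed-window pattern with an in-memory dict standing in for Redis; the limits and key scheme are illustrative, not QDM's actual configuration.

```python
import time


class FixedWindowRateLimiter:
    """Fixed-window rate limiter; a dict stands in for Redis INCR/EXPIRE here."""

    def __init__(self, limit: int, window_seconds: int):
        self.limit = limit
        self.window = window_seconds
        self.counters = {}  # key -> (window_start, count); Redis would expire these

    def allow(self, key: str, now=None) -> bool:
        now = time.time() if now is None else now
        window_start = int(now // self.window) * self.window
        start, count = self.counters.get(key, (window_start, 0))
        if start != window_start:  # new window: counter resets (EXPIRE would fire)
            start, count = window_start, 0
        if count >= self.limit:
            return False  # request rejected, e.g. with HTTP 429
        self.counters[key] = (start, count + 1)
        return True
```

Using Redis instead of a local dict makes the counters shared across service replicas, which is why rate-limiting state lives there rather than in process memory.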
Where does data processing for Quest Data Modeler AI take place geographically?
Processing occurs on Microsoft Azure. The specific region depends on where the Azure OpenAI resource is deployed. Application services run on Azure Kubernetes Service (AKS).
Can AI features in Quest Data Modeler be used with offline licenses?
No. AI features require active internet connectivity.
Is user data used to train or fine-tune AI models used by Quest Data Modeler?
No. User data is not used to train or fine-tune AI models. Azure OpenAI also does not use customer data for model training.
Where does the data entered by users into Quest Data Modeler AI get processed?
All processing occurs within Microsoft Azure’s secure cloud infrastructure.
Is AI enabled by default in Quest Data Modeler and how can it be disabled?
Yes. AI is enabled by default. It can be disabled by:
- Removing or stopping the AI-service microservice, or
- Configuring the API Gateway to block AI-service endpoints.
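Blocking AI endpoints at the gateway amounts to rejecting requests whose path matches the AI service's route prefix before they are forwarded. A minimal sketch of that routing decision follows; the `/ai/` prefix and status codes are illustrative assumptions, not QDM's actual routes.

```python
# Hypothetical route prefixes for the AI service (not QDM's real paths).
AI_PATH_PREFIXES = ("/ai/",)


def gateway_decision(path: str, ai_enabled: bool) -> int:
    """Return the HTTP status an API gateway might answer with:
    403 for AI routes when the feature is disabled, 200 otherwise."""
    if not ai_enabled and path.startswith(AI_PATH_PREFIXES):
        return 403  # blocked before reaching the AI microservice
    return 200
```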
Can other Quest customers access or view my data entered into Quest Data Modeler AI?
No. Quest enforces strict per-user and per-catalog data isolation.
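Per-user and per-catalog isolation means every data access is scoped to the requesting user's identity and the catalog they are working in, so queries can never return another tenant's rows. The sketch below illustrates the idea with an in-memory filter; the field names are hypothetical, not QDM's schema.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Record:
    owner_id: str    # hypothetical column names for illustration
    catalog_id: str
    payload: str


def visible_records(records, user_id: str, catalog_id: str):
    """Per-user, per-catalog scoping: only the requester's rows in the
    requested catalog are ever returned."""
    return [r for r in records
            if r.owner_id == user_id and r.catalog_id == catalog_id]
```

In a real database this scoping would be enforced server-side (e.g. mandatory WHERE clauses or row-level security), not left to the client.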
Does Quest collect usage metrics for AI features in Quest Data Modeler?
Yes. The service collects per-request metrics, which are logged locally and stored in the database.
Is all AI-related data in Quest Data Modeler encrypted?
Yes. Data in transit is encrypted using HTTPS and TLS. Azure services provide encryption at rest.
What security controls are in place to protect the AI models used by Quest Data Modeler?
Security measures include:
- API key authentication
- JWT authentication
- Role-based access control (RBAC)
- Rate limiting
- HSTS enforcement
- Input validation
- System prompt safety rules
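Input validation typically runs before a prompt ever reaches the LLM: bounding its size, removing control characters, and screening for obvious injection markers. The sketch below is a crude illustration of those checks under assumed limits and patterns; it is not QDM's actual validation logic, and real injection defense requires far more than pattern matching.

```python
import re

MAX_PROMPT_CHARS = 4000  # illustrative limit, not a QDM setting
INJECTION_PATTERNS = [   # crude illustrative markers, not an exhaustive defense
    re.compile(r"ignore (all )?previous instructions", re.I),
    re.compile(r"reveal (the )?system prompt", re.I),
]


def validate_prompt(text: str):
    """Basic pre-LLM checks: size bound, control characters stripped,
    crude injection screen. Returns (ok, cleaned_text_or_reason)."""
    cleaned = "".join(ch for ch in text if ch.isprintable() or ch in "\n\t")
    if not cleaned.strip():
        return False, "empty prompt"
    if len(cleaned) > MAX_PROMPT_CHARS:
        return False, "prompt too long"
    if any(p.search(cleaned) for p in INJECTION_PATTERNS):
        return False, "possible prompt injection"
    return True, cleaned
```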
Does Quest Data Modeler AI comply with data protection regulations such as GDPR and CCPA?
Yes. Azure OpenAI complies with Microsoft’s Data Protection Addendum (DPA), supporting GDPR and related privacy regulations.
Have AI-related risks been identified and mitigated for Quest Data Modeler?
Yes. Mitigations include:
- Prompt injection protection
- Human-in-the-loop approval workflows
- Validation and retry mechanisms
- Rate limiting
- Snapshot-based rollback
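Snapshot-based rollback means the service captures a copy of the data model before each AI-driven change, so an unwanted modification can be undone by restoring the prior snapshot. A minimal sketch of the mechanism, with a plain dict standing in for a real data model, follows; the class and method names are illustrative.

```python
import copy


class ModelSession:
    """Snapshot-based rollback sketch: deep-copy the model before each change."""

    def __init__(self, model: dict):
        self.model = model
        self._snapshots = []  # stack of pre-change states

    def apply_change(self, change):
        # Capture the current state so this change can be rolled back.
        self._snapshots.append(copy.deepcopy(self.model))
        change(self.model)

    def rollback(self):
        # Restore the most recent snapshot, undoing the last change.
        if self._snapshots:
            self.model = self._snapshots.pop()
```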
What are the known limitations of AI features in Quest Data Modeler?
Known limitations include:
- Rate and token limits
- Requirement for internet connectivity
- Limited supported file types
- Potential inaccuracies requiring user review
- Scope restricted to data modeling tasks
What URLs and network endpoints are required to support AI features in Quest Data Modeler?
Azure OpenAI Endpoint:
https://dm-ai-api.openai.azure.com
Does the AI chatbot use session data to train or improve the underlying language model?
No. Session data is used only to maintain conversation context and is not used for model training or improvement.
Is chatbot usage audited or logged for security and compliance purposes?
Yes. All AI interactions are logged and audited for security, monitoring, and compliance purposes.