March 26, 2026 · 9 min read

Connecticut Data Privacy Act and AI Training Disclosure

Connecticut matters when a product involves AI training, minors, chatbots, or location data.

Connecticut merits close review because the Connecticut Data Privacy Act follows the familiar omnibus notice-and-rights model while the Attorney General has taken an active enforcement role.

A business using targeted advertising, processing children's data, collecting geolocation data, or using personal data to train large language models or similar systems should read Connecticut more closely than a generic omnibus summary allows.

The privacy policy page and the surrounding compliance workflow should account for the issues the Connecticut Attorney General has already identified as active enforcement or policy concerns.

The baseline notice is familiar, and it has to be complete

Connecticut expects the privacy notice to describe the categories of personal data processed, the purposes for processing, whether and why personal data is shared with third parties, and how consumers can exercise their rights. It also expects a functioning path for appeals when requests are denied. That is the baseline structure.

As with other omnibus laws, the drafting problem is solved through operating facts rather than abstract category lists. The business has to know what the site collects, which vendors receive it, which targeted advertising tools are active, and what the rights workflow looks like in practice. If those operating facts are unclear, the Connecticut page will drift toward generic statements that do not hold up once a request or complaint arrives.
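Those operating facts can be captured in a lightweight internal inventory before any drafting begins. The sketch below is purely illustrative: every category, purpose, and vendor name is a hypothetical placeholder, not a required format or a real integration.

```python
# Hypothetical inventory of the operating facts behind a privacy notice.
# All categories, purposes, and vendor names are illustrative placeholders.
SITE_DATA_MAP = {
    "email address": {
        "purposes": ["account creation", "support"],
        "vendors": ["HelpdeskCo"],          # third parties that receive it
        "targeted_advertising": False,
    },
    "browsing activity": {
        "purposes": ["analytics", "ad measurement"],
        "vendors": ["AdPixelCo"],
        "targeted_advertising": True,
    },
}

# Categories involved in targeted advertising need their own treatment
# in the notice and in the opt-out workflow.
ad_categories = [category for category, facts in SITE_DATA_MAP.items()
                 if facts["targeted_advertising"]]
print(ad_categories)  # ['browsing activity']
```

A table like this is not a compliance artifact in itself, but it keeps the notice anchored to what the product actually does rather than to a template's abstractions.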

Connecticut is also an enforcement signal state

The Connecticut Attorney General's office has used the law as more than a background disclosure statute. Its published reporting has highlighted work involving children's and teens' online safety, connected vehicles and geolocation data, gaming platforms, chatbots, and data brokers. That reporting shows where the office is already focusing attention.

For a business with youth-facing products, location features, automated tools, or rich behavioral data, Connecticut should be reviewed as a live risk issue rather than as one more line in a fifty-state chart.

The AI training disclosure changes the conversation

Connecticut now stands out because the Attorney General has said recent amendments include a disclosure requirement when personal data is used to train large language models. That is a distinct issue from the standard privacy notice categories. A business can have a reasonably complete rights table and still underdisclose if it uses customer messages, uploads, behavioral data, or account information to train AI systems without saying so clearly.

This changes the review for businesses building AI features into support, recommendations, search, content generation, moderation, or internal tooling. The privacy page needs to describe the AI training use if the law requires it, and the internal data map needs to confirm whether that use is taking place.
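One way to keep the data map and the notice in sync on this point is a simple gap check. This is a sketch under the assumption that the business maintains both as structured lists; the field names and categories are invented for illustration.

```python
# Hypothetical data map entries: (category, used_to_train_ai)
DATA_USES = [
    ("support messages", True),
    ("uploaded documents", True),
    ("email address", False),
]

# Categories the current privacy notice discloses as AI training inputs
# (illustrative -- in practice, pull this from the published notice).
DISCLOSED_AI_TRAINING = {"support messages"}

def ai_training_gaps(data_uses, disclosed):
    """Return categories used for AI training but absent from the notice."""
    return [category for category, trains_ai in data_uses
            if trains_ai and category not in disclosed]

print(ai_training_gaps(DATA_USES, DISCLOSED_AI_TRAINING))
# ['uploaded documents'] -- a disclosure gap to resolve before publishing
```

The check is trivial by design; the hard work is keeping the data map honest as AI features are added to support, search, or moderation pipelines.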

Connecticut is a state where the product and your policy have to stay close together

The strongest Connecticut risk is the distance between the way the product works and the way your policy reads. If your site targets minors, collects precise location, relies on chatbots, or uses personal data in AI systems, those facts should drive the disclosure analysis from the start.

Connecticut forces your business to review the actual product features alongside the standard privacy clauses. A general privacy template can describe rights in a technically complete way and miss the issues Connecticut is most likely to surface.

What to review before publishing under Connecticut law

A Connecticut review should start with the product features and data uses that make the law more specific than a generic omnibus summary would suggest, then move into your policy and rights workflow.

  • Confirm the baseline categories, purposes, third party disclosures, rights path, and appeal path in the privacy notice
  • Review whether the product targets minors or processes geolocation or other sensitive data in a way that changes the analysis
  • Check whether personal data is used to train large language models or similar AI systems and whether that use needs to be disclosed
  • Make sure your policy describes the actual product and vendor stack instead of an abstract compliance model
  • Treat Connecticut as an active risk signal and a meaningful entry in a multistate chart

Key Takeaways

  • Connecticut requires the standard notice, rights, and appeal structure, and the Attorney General has taken an active enforcement posture.
  • The state now raises a distinct AI training disclosure issue for businesses that use personal data to train large language models or similar systems.
  • Youth-facing products, geolocation features, chatbots, and data-rich products should treat Connecticut as a meaningful review state.
  • The page needs to match the real product features and data uses as well as the standard omnibus law checklist.


Turn this into a real document

TermsBuilder uses an attorney-built questionnaire to turn these legal issues into Terms & Conditions and Privacy Policy pages that match the way your business operates.

Start your document set