Getting Started with Cirvia Parental SDK

Welcome! The Cirvia Parental Android SDK provides intelligent content moderation powered by AI to help protect children online. In just a few minutes, you'll have real-time content monitoring integrated into your app.

What You'll Build

By the end of this guide, your Android app will:

  • ✅ Monitor text content across your platform
  • ✅ Analyze images for inappropriate material
  • ✅ Automatically report incidents to parent dashboards
  • ✅ Provide risk scores for flagged content

Prerequisites

Before you begin, ensure you have:

  • Android Studio (latest version recommended)
  • Minimum SDK: API Level 21 (Android 5.0)
  • Google Play Services configured in your project
  • Internet permission in your AndroidManifest.xml

Quick Integration (5 minutes)

Step 1: Add Dependencies

Add these dependencies to your app-level build.gradle:

implementation 'com.squareup.okhttp3:okhttp:4.9.3'
implementation 'com.google.code.gson:gson:2.8.9'
implementation 'com.google.android.gms:play-services-auth:20.4.1'

Step 2: Add Permissions

Add to your AndroidManifest.xml:

<uses-permission android:name="android.permission.INTERNET" />

Step 3: Initialize the SDK

In your main activity (typically in onCreate()):

import android.util.Log;

import com.cirvia.parentalai.ParentalAI;
import com.cirvia.parentalai.config.ParentalAIConfig;

// Configure the SDK
ParentalAIConfig config = new ParentalAIConfig(
        "your-api-key",                                                     // Your API key
        "https://parentalai-production.up.railway.app/api/v1/ingest/send", // Ingest endpoint
        "https://parentalai-production.up.railway.app/api/auth/login",     // Auth endpoint
        true                                                                // Enable debug logging
);

// Initialize with user consent
ParentalAI.init(this, config,
        () -> {
            // Success - SDK ready to use
            Log.d("MyApp", "Cirvia Parental initialized successfully");
        },
        () -> {
            // User declined or initialization failed
            Log.d("MyApp", "Cirvia Parental initialization declined");
        }
);
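
For orientation, here is a minimal sketch of where this initialization typically lives; the Activity class and layout name are placeholders, not part of the SDK:

import android.app.Activity;
import android.os.Bundle;

public class MainActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main); // placeholder layout

        // Build the ParentalAIConfig and call ParentalAI.init(...) here,
        // exactly as shown above, so monitoring is ready before any
        // content is submitted.
    }
}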

Step 4: Monitor Content

// Monitor text content
ParentalAI.sendTextIncident("your-platform-name", userMessage);

// Monitor images (base64 encoded)
ParentalAI.sendImageIncident("your-platform-name", base64ImageString);
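
To make this concrete, here is a minimal sketch that reports each outgoing chat message before it is sent; the Button and EditText view IDs are placeholder names for your own UI, not part of the SDK:

import android.widget.Button;
import android.widget.EditText;
import com.cirvia.parentalai.ParentalAI;

// Inside your Activity, once ParentalAI.init(...) has completed
Button sendButton = findViewById(R.id.send_button);       // placeholder view ID
EditText messageInput = findViewById(R.id.message_input); // placeholder view ID

sendButton.setOnClickListener(v -> {
    String message = messageInput.getText().toString();
    ParentalAI.sendTextIncident("your-platform-name", message); // submit for analysis
    // ...your normal send logic continues here
});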

What Happens Next?

  1. Content Analysis: Your content is analyzed by our AI moderation system
  2. Risk Assessment: Each incident receives a risk score (1-10)
  3. Parent Notification: High-risk content triggers parent dashboard alerts
  4. Dashboard Access: Parents can view, manage, and escalate incidents

Platform Examples

Roblox Integration

// When user posts in chat
String chatMessage = userInput.getText().toString();
ParentalAI.sendTextIncident("roblox", chatMessage);

Discord Bot Integration

import net.dv8tion.jda.api.events.message.MessageReceivedEvent;
import net.dv8tion.jda.api.hooks.ListenerAdapter;

public class ModerationListener extends ListenerAdapter {
    // Called by JDA whenever a message is received
    @Override
    public void onMessageReceived(MessageReceivedEvent event) {
        String content = event.getMessage().getContentRaw();
        ParentalAI.sendTextIncident("discord", content);
    }
}

Social Media App

// When user uploads image
String base64Image = convertImageToBase64(userImage);
ParentalAI.sendImageIncident("my-social-app", base64Image);
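
Note that convertImageToBase64 is not part of the SDK; it is a helper you supply yourself. Here is a minimal sketch using standard Android APIs, assuming the uploaded image is available as a Bitmap:

import android.graphics.Bitmap;
import android.util.Base64;
import java.io.ByteArrayOutputStream;

// Compress the Bitmap to JPEG and Base64-encode the resulting bytes
private String convertImageToBase64(Bitmap image) {
    ByteArrayOutputStream buffer = new ByteArrayOutputStream();
    image.compress(Bitmap.CompressFormat.JPEG, 80, buffer); // 80 = JPEG quality
    return Base64.encodeToString(buffer.toByteArray(), Base64.NO_WRAP);
}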

Next Steps

Now that you have basic integration working:

  1. Complete Installation Guide → Detailed setup instructions
  2. Authentication Setup → Configure Google OAuth
  3. API Reference → Full SDK documentation
  4. Platform Integration Guide → Platform-specific examples

Ready to protect children online? Continue to the Installation Guide for detailed setup instructions.