restructure
Some checks failed
CI - Multi-Platform Native / Build iOS (RSSuper) (push) Has been cancelled
CI - Multi-Platform Native / Build macOS (push) Has been cancelled
CI - Multi-Platform Native / Build Android (push) Has been cancelled
CI - Multi-Platform Native / Build Linux (push) Has been cancelled
CI - Multi-Platform Native / Build Summary (push) Has been cancelled

This commit is contained in:
2026-03-30 16:39:18 -04:00
parent a8e07d52f0
commit c2e1622bd8
252 changed files with 4803 additions and 17165 deletions


@@ -1,263 +0,0 @@
# RSSuper - Multi-Platform Native Build System
This directory contains the infrastructure for building RSSuper natively across multiple platforms.
## Architecture
```
native-route/
├── ios/ - iOS/macOS app (Swift/SwiftUI)
│ └── RSSuper/
│ ├── RSSuper.xcodeproj/
│ ├── RSSuper/
│ ├── RSSuperTests/
│ └── RSSuperUITests/
├── android/ - Android app (Kotlin/Jetpack Compose)
│ └── (auto-generated on first build)
├── linux/ - Linux app (C/Vala + GTK4)
│ └── (auto-generated on first build)
└── windows/ - Windows app (TODO)
```
## Quick Start
### Build All Platforms
```bash
# From project root
./scripts/build.sh
```
### Build Specific Platform
```bash
# iOS/macOS only
./scripts/build.sh -p ios
# Android only
./scripts/build.sh -p android
# Linux only
./scripts/build.sh -p linux
# Multiple platforms
./scripts/build.sh -p ios,android
```
### Build Types
```bash
# Debug build (default)
./scripts/build.sh -t debug
# Release build
./scripts/build.sh -t release
```
### Run Tests
```bash
# Build and test all platforms
./scripts/build.sh --test
# Test specific platform
./scripts/build.sh -p ios --test
```
### Clean Build
```bash
# Clean all platforms
./scripts/build.sh -a clean
# Clean specific platform
./scripts/build-ios.sh clean
./scripts/build-android.sh clean
./scripts/build-linux.sh clean
```
## Individual Platform Scripts
### iOS/macOS
```bash
# Build
./scripts/build-ios.sh [Debug|Release] [iOS|macOS] [destination] [action]
# Examples
./scripts/build-ios.sh # Debug iOS
./scripts/build-ios.sh Release # Release iOS
./scripts/build-ios.sh Debug iOS "platform=iOS Simulator,name=iPhone 15"
./scripts/build-ios.sh clean
./scripts/build-ios.sh Debug iOS "generic/platform=iOS" test
```
### Android
```bash
# Build
./scripts/build-android.sh [debug|release] [assemble|build|test|clean]
# Examples
./scripts/build-android.sh # Assemble debug
./scripts/build-android.sh release # Assemble release
./scripts/build-android.sh debug test # Run tests
./scripts/build-android.sh clean # Clean
```
### Linux
```bash
# Build
./scripts/build-linux.sh [debug|release] [build|install|install-deps|run|test|clean|setup]
# Examples
./scripts/build-linux.sh # Build debug
./scripts/build-linux.sh release # Build release
./scripts/build-linux.sh debug setup # Setup build environment
./scripts/build-linux.sh debug install-deps # Install dependencies
./scripts/build-linux.sh debug run # Build and run
./scripts/build-linux.sh clean # Clean
```
## Platform-Specific Details
### iOS/macOS
- **Language**: Swift
- **UI Framework**: SwiftUI
- **Build System**: Xcode/xcodebuild
- **Minimum Deployment**: iOS 16.0+
- **Features**:
- SwiftUI for declarative UI
- Combine for reactive programming
- Core Data for persistence
- Background fetch for feed updates
### Android
- **Language**: Kotlin
- **UI Framework**: Jetpack Compose
- **Build System**: Gradle
- **Minimum SDK**: 24 (Android 7.0)
- **Features**:
- Jetpack Compose for modern UI
- ViewModel + LiveData for state management
- Room for local database
- Retrofit for networking
### Linux
- **Language**: C + Vala
- **UI Framework**: GTK4 + Libadwaita
- **Build System**: Meson + Ninja
- **Dependencies**:
- GTK4
- Libadwaita
- SQLite3
- libxml2
- libsoup-3.0
- **Features**:
- Native Linux look and feel
- GNOME integration
- System tray support
- Desktop notifications
## CI/CD
The GitHub Actions workflow is defined in `.github/workflows/ci.yml`:
- **iOS**: Builds on macos-15 runner
- **Android**: Builds on ubuntu-24.04 runner
- **Linux**: Builds on ubuntu-24.04 runner
### Workflow Features
- Automatic builds on push/PR
- Manual trigger with configurable options
- Build artifacts uploaded for download
- Build reports generated
- Test execution (configurable)
## Project Structure Template
When you add shared code, use this structure:
```
RSSuper/
├── native-route/
│ ├── common/ # Shared code (if using a shared language)
│ ├── ios/
│ │ └── RSSuper/
│ ├── android/
│ ├── linux/
│ └── windows/
├── scripts/
│ ├── build.sh # Main build orchestrator
│ ├── build-ios.sh # iOS/macOS builder
│ ├── build-android.sh # Android builder
│ ├── build-linux.sh # Linux builder
│ └── common.sh # Shared utilities
└── .github/
└── workflows/
└── ci.yml # CI configuration
```
## Migration Notes
The build scripts were adapted from the Nessa project, reusing its:
- Xcode version selection logic
- Build report generation
- Error extraction and display
- CI workflow structure
## Troubleshooting
### iOS Build Fails
```bash
# Check Xcode installation
xcodebuild -version
# Select Xcode manually
sudo xcode-select -s /Applications/Xcode.app/Contents/Developer
# Clean DerivedData
rm -rf ~/Library/Developer/Xcode/DerivedData/RSSuper-*
```
### Android Build Fails
```bash
# Check Java installation
java -version
# Check Android SDK
echo $ANDROID_HOME
# Run with more verbose output
./scripts/build-android.sh debug assemble --info
```
### Linux Build Fails
```bash
# Install dependencies (Ubuntu/Debian)
sudo apt install meson ninja-build pkg-config libgtk-4-dev libadwaita-1-dev
# Check meson installation
meson --version
# Setup build manually
cd native-route/linux
meson setup build --buildtype=debug
```
## Future Enhancements
- [ ] Windows support (Win32 + DirectUI or WebView2)
- [ ] Shared business logic layer
- [ ] Cross-platform test suite
- [ ] Automated code signing
- [ ] App store deployment scripts
- [ ] Performance benchmarking


@@ -1,2 +0,0 @@
.gradle
build


@@ -1,74 +0,0 @@
plugins {
id("com.android.library") version "8.5.0"
id("org.jetbrains.kotlin.android") version "1.9.22"
id("org.jetbrains.kotlin.plugin.parcelize") version "1.9.22"
id("org.jetbrains.kotlin.kapt") version "1.9.22"
}
android {
namespace = "com.rssuper"
compileSdk = 34
defaultConfig {
minSdk = 24
testInstrumentationRunner = "androidx.test.runner.AndroidJUnitRunner"
}
buildTypes {
release {
isMinifyEnabled = false
}
}
compileOptions {
sourceCompatibility = JavaVersion.VERSION_17
targetCompatibility = JavaVersion.VERSION_17
isCoreLibraryDesugaringEnabled = true
}
kotlinOptions {
jvmTarget = "17"
}
sourceSets {
getByName("main") {
java.srcDirs("src/main/java")
}
}
}
dependencies {
coreLibraryDesugaring("com.android.tools:desugar_jdk_libs:2.0.4")
// AndroidX
implementation("androidx.core:core-ktx:1.12.0")
// XML Parsing - built-in XmlPullParser
implementation("androidx.room:room-runtime:2.6.1")
implementation("androidx.room:room-ktx:2.6.1")
kapt("androidx.room:room-compiler:2.6.1")
// Moshi for JSON serialization
implementation("com.squareup.moshi:moshi:1.15.0")
kapt("com.squareup.moshi:moshi-kotlin-codegen:1.15.0")
implementation("com.squareup.moshi:moshi-kotlin:1.15.0")
// OkHttp for networking
implementation("com.squareup.okhttp3:okhttp:4.12.0")
implementation("com.squareup.okhttp3:logging-interceptor:4.12.0")
// Testing
testImplementation("junit:junit:4.13.2")
testImplementation("com.squareup.moshi:moshi:1.15.0")
testImplementation("com.squareup.moshi:moshi-kotlin:1.15.0")
testImplementation("org.mockito:mockito-core:5.7.0")
testImplementation("org.mockito:mockito-inline:5.2.0")
testImplementation("androidx.room:room-testing:2.6.1")
testImplementation("org.jetbrains.kotlinx:kotlinx-coroutines-test:1.7.3")
testImplementation("androidx.arch.core:core-testing:2.2.0")
testImplementation("androidx.test:core:1.5.0")
testImplementation("androidx.test.ext:junit:1.1.5")
testImplementation("androidx.test:runner:1.5.2")
testImplementation("org.robolectric:robolectric:4.11.1")
testImplementation("com.squareup.okhttp3:mockwebserver:4.12.0")
}


@@ -1,6 +0,0 @@
org.gradle.jvmargs=-Xmx2048m -Dfile.encoding=UTF-8 --add-opens=jdk.compiler/com.sun.tools.javac.main=ALL-UNNAMED --add-opens=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED --add-opens=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED --add-opens=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED --add-opens=jdk.compiler/com.sun.tools.javac.code=ALL-UNNAMED --add-opens=jdk.compiler/com.sun.tools.javac.comp=ALL-UNNAMED --add-opens=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED --add-opens=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED --add-opens=jdk.compiler/com.sun.tools.javac.processing=ALL-UNNAMED
kapt.use.worker.api=false
android.useAndroidX=true
android.enableJetifier=true
kotlin.code.style=official
android.nonTransitiveRClass=true


@@ -1,7 +0,0 @@
distributionBase=GRADLE_USER_HOME
distributionPath=wrapper/dists
distributionUrl=https\://services.gradle.org/distributions/gradle-8.7-bin.zip
networkTimeout=10000
validateDistributionUrl=true
zipStoreBase=GRADLE_USER_HOME
zipStorePath=wrapper/dists


@@ -1,170 +0,0 @@
#!/bin/sh
#
# Copyright 2015-2021 the original author or authors.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
##############################################################################
##
## Gradle start up script for UN*X
##
##############################################################################
# Attempt to set APP_HOME
# Resolve links: $0 may be a link
PRG="$0"
# Need this for relative symlinks.
while [ -h "$PRG" ] ; do
ls=`ls -ld "$PRG"`
link=`expr "$ls" : '.*-> \(.*\)$'`
if expr "$link" : '/.*' > /dev/null; then
PRG="$link"
else
PRG=`dirname "$PRG"`"/$link"
fi
done
SAVED="`pwd`"
cd "`dirname \"$PRG\"`/" >/dev/null
APP_HOME="`pwd -P`"
cd "$SAVED" >/dev/null
APP_NAME="Gradle"
APP_BASE_NAME=`basename "$0"`
# Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script.
DEFAULT_JVM_OPTS='"-Xmx64m" "-Xms64m"'
# Use the maximum available, or set MAX_FD != -1 to use that value.
MAX_FD="maximum"
warn () {
echo "$*"
}
die () {
echo
echo "$*"
echo
exit 1
}
# OS specific support (must be 'true' or 'false').
cygwin=false
msys=false
darwin=false
nonstop=false
case "`uname`" in
CYGWIN* )
cygwin=true
;;
Darwin* )
darwin=true
;;
MINGW* )
msys=true
;;
NONSTOP* )
nonstop=true
;;
esac
CLASSPATH=$APP_HOME/gradle/wrapper/gradle-wrapper.jar
# Determine the Java command to use to start the JVM.
if [ -n "$JAVA_HOME" ] ; then
if [ -x "$JAVA_HOME/jre/sh/java" ] ; then
# IBM's JDK on AIX uses strange locations for the executables
JAVACMD="$JAVA_HOME/jre/sh/java"
else
JAVACMD="$JAVA_HOME/bin/java"
fi
if [ ! -x "$JAVACMD" ] ; then
die "ERROR: JAVA_HOME is set to an invalid directory: $JAVA_HOME
Please set the JAVA_HOME variable in your environment to match the
location of your Java installation."
fi
else
JAVACMD="java"
which java >/dev/null 2>&1 || die "ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH.
Please set the JAVA_HOME variable in your environment to match the
location of your Java installation."
fi
# Increase the maximum file descriptors if we can.
if [ "$cygwin" = "false" -a "$darwin" = "false" -a "$nonstop" = "false" ] ; then
MAX_FD_LIMIT=`ulimit -H -n`
if [ $? -eq 0 ] ; then
if [ "$MAX_FD" = "maximum" -o "$MAX_FD" = "max" ] ; then
MAX_FD="$MAX_FD_LIMIT"
fi
ulimit -n $MAX_FD
if [ $? -ne 0 ] ; then
warn "Could not set maximum file descriptor limit: $MAX_FD"
fi
else
warn "Could not query maximum file descriptor limit: $MAX_FD_LIMIT"
fi
fi
# For Darwin, add options to specify how the application appears in the dock
if $darwin; then
GRADLE_OPTS="$GRADLE_OPTS \"-Xdock:name=$APP_NAME\" \"-Xdock:icon=$APP_HOME/media/gradle.icns\""
fi
# For Cygwin or MSYS, switch paths to Windows format before running java
if [ "$cygwin" = "true" -o "$msys" = "true" ] ; then
APP_HOME=`cygpath --path --mixed "$APP_HOME"`
CLASSPATH=`cygpath --path --mixed "$CLASSPATH"`
JAVACMD=`cygpath --unix "$JAVACMD"`
fi
# Collect all arguments for the java command;
# * $DEFAULT_JVM_OPTS, $JAVA_OPTS, and $GRADLE_OPTS can contain fragments of
# temporary options; we will parse these below.
# * There is no need to specify -classpath explicitly.
# * Gradle's Java options need to be preprocessed to be merged.
# * We use eval to parse quoted options properly.
# Collect arguments from the command line
set -- \
"-Dorg.gradle.appname=$APP_BASE_NAME" \
-classpath "$CLASSPATH" \
org.gradle.wrapper.GradleWrapperMain \
"$@"
# Stop when "xargs" is not available.
if ! command -v xargs >/dev/null 2>&1
then
die "xargs is not available"
fi
# Use "xargs" to parse quoted args.
#
# With -n1 it outputs one arg per line, with the quotes and backslashes removed.
#
eval "set -- $(
printf '%s\n' "$DEFAULT_JVM_OPTS $JAVA_OPTS $GRADLE_OPTS" |
xargs -n1 |
sed ' s~[^-[:alnum:]+,./:=@_]~\\&~g; ' |
tr '\n' ' '
)" '"$@"'
exec "$JAVACMD" "$@"


@@ -1,18 +0,0 @@
pluginManagement {
repositories {
google()
mavenCentral()
gradlePluginPortal()
}
}
dependencyResolutionManagement {
repositoriesMode.set(RepositoriesMode.FAIL_ON_PROJECT_REPOS)
repositories {
google()
mavenCentral()
}
}
rootProject.name = "RSSuper"
include(":android")


@@ -1,16 +0,0 @@
package com.rssuper.converters
import androidx.room.TypeConverter
import java.util.Date
class DateConverter {
@TypeConverter
fun fromTimestamp(value: Long?): Date? {
return value?.let { Date(it) }
}
@TypeConverter
fun dateToTimestamp(date: Date?): Long? {
return date?.time
}
}
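The converter's logic can be exercised as plain Kotlin (the `@TypeConverter` annotations and Room are omitted here; this is just the mapping, not the Room wiring):

```kotlin
import java.util.Date

// Mirrors DateConverter: Room persists Date columns as epoch-millis Longs.
fun fromTimestamp(value: Long?): Date? = value?.let { Date(it) }
fun dateToTimestamp(date: Date?): Long? = date?.time

fun main() {
    // The round trip is lossless because Date already has millisecond precision.
    check(dateToTimestamp(fromTimestamp(1_700_000_000_000L)) == 1_700_000_000_000L)
    // Nulls pass through unchanged, matching Room's nullable column handling.
    check(fromTimestamp(null) == null && dateToTimestamp(null) == null)
    println("ok")
}
```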


@@ -1,23 +0,0 @@
package com.rssuper.converters
import androidx.room.TypeConverter
import com.squareup.moshi.Moshi
import com.squareup.moshi.Types
import com.squareup.moshi.kotlin.reflect.KotlinJsonAdapterFactory
import com.rssuper.models.FeedItem
class FeedItemListConverter {
    private val moshi = Moshi.Builder()
        .add(KotlinJsonAdapterFactory())
        .build()
    // A raw List adapter erases the element type and would deserialize items
    // as Maps; build a parameterized adapter for List<FeedItem> instead.
    private val adapter = moshi.adapter<List<FeedItem>>(
        Types.newParameterizedType(List::class.java, FeedItem::class.java)
    )
    @TypeConverter
    fun fromFeedItemList(value: List<FeedItem>?): String? {
        return value?.let { adapter.toJson(it) }
    }
    @TypeConverter
    fun toFeedItemList(value: String?): List<FeedItem>? {
        return value?.let { adapter.fromJson(it) }
    }
}


@@ -1,15 +0,0 @@
package com.rssuper.converters
import androidx.room.TypeConverter
class StringListConverter {
@TypeConverter
fun fromStringList(value: List<String>?): String? {
return value?.joinToString(",")
}
@TypeConverter
fun toStringList(value: String?): List<String>? {
return value?.split(",")?.filter { it.isNotEmpty() }
}
}
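A plain-Kotlin round trip of the converter's mapping shows one caveat worth knowing when choosing tag strings: an element that itself contains a comma does not survive the round trip.

```kotlin
// Mirrors StringListConverter: lists are stored as a single comma-joined string.
fun fromStringList(value: List<String>?): String? = value?.joinToString(",")
fun toStringList(value: String?): List<String>? =
    value?.split(",")?.filter { it.isNotEmpty() }

fun main() {
    // Comma-free elements round-trip cleanly.
    check(toStringList(fromStringList(listOf("tech", "news"))) == listOf("tech", "news"))
    // An element containing a comma is split on the way back.
    check(toStringList(fromStringList(listOf("a,b"))) == listOf("a", "b"))
    // The isNotEmpty filter makes the empty string decode to an empty list.
    check(toStringList("") == emptyList<String>())
    println("ok")
}
```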


@@ -1,87 +0,0 @@
package com.rssuper.database
import android.content.Context
import androidx.room.Database
import androidx.room.Entity
import androidx.room.Room
import androidx.room.RoomDatabase
import androidx.room.TypeConverters
import androidx.sqlite.db.SupportSQLiteDatabase
import com.rssuper.converters.DateConverter
import com.rssuper.converters.FeedItemListConverter
import com.rssuper.converters.StringListConverter
import com.rssuper.database.daos.FeedItemDao
import com.rssuper.database.daos.SearchHistoryDao
import com.rssuper.database.daos.SubscriptionDao
import com.rssuper.database.entities.FeedItemEntity
import com.rssuper.database.entities.SearchHistoryEntity
import com.rssuper.database.entities.SubscriptionEntity
import kotlinx.coroutines.CoroutineScope
import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.launch
import java.util.Date
@Database(
entities = [
SubscriptionEntity::class,
FeedItemEntity::class,
SearchHistoryEntity::class
],
version = 1,
exportSchema = true
)
@TypeConverters(DateConverter::class, StringListConverter::class, FeedItemListConverter::class)
abstract class RssDatabase : RoomDatabase() {
abstract fun subscriptionDao(): SubscriptionDao
abstract fun feedItemDao(): FeedItemDao
abstract fun searchHistoryDao(): SearchHistoryDao
companion object {
@Volatile
private var INSTANCE: RssDatabase? = null
fun getDatabase(context: Context): RssDatabase {
return INSTANCE ?: synchronized(this) {
val instance = Room.databaseBuilder(
context.applicationContext,
RssDatabase::class.java,
"rss_database"
)
.addCallback(DatabaseCallback())
.build()
INSTANCE = instance
instance
}
}
}
    private class DatabaseCallback : RoomDatabase.Callback() {
        override fun onCreate(db: SupportSQLiteDatabase) {
            super.onCreate(db)
            // Create the FTS table directly: the callback already runs off the
            // main thread, and INSTANCE is still null while the database is
            // being built, so launching a coroutine via INSTANCE would be a no-op.
            createFTSVirtualTable(db)
        }
        override fun onOpen(db: SupportSQLiteDatabase) {
            super.onOpen(db)
            createFTSVirtualTable(db)
        }
        private fun createFTSVirtualTable(db: SupportSQLiteDatabase) {
            // External-content FTS5 index over feed_items. contentless_delete
            // is only valid for contentless tables (content=''), so it is omitted.
            db.execSQL("""
                CREATE VIRTUAL TABLE IF NOT EXISTS feed_items_fts USING fts5(
                    title,
                    description,
                    content,
                    author,
                    content='feed_items'
                )
            """.trimIndent())
        }
    }
}
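With an external-content FTS5 table (`content='feed_items'`), SQLite does not keep the index in sync with the content table automatically. The usual pattern is a set of AFTER INSERT/UPDATE/DELETE triggers; the sketch below is illustrative and not part of this commit.

```sql
-- Hypothetical sync triggers for the external-content feed_items_fts index;
-- shown for illustration, not present in this commit.
CREATE TRIGGER IF NOT EXISTS feed_items_ai AFTER INSERT ON feed_items BEGIN
  INSERT INTO feed_items_fts(rowid, title, description, content, author)
  VALUES (new.rowid, new.title, new.description, new.content, new.author);
END;
CREATE TRIGGER IF NOT EXISTS feed_items_ad AFTER DELETE ON feed_items BEGIN
  -- FTS5 'delete' command removes the old row from the index.
  INSERT INTO feed_items_fts(feed_items_fts, rowid, title, description, content, author)
  VALUES ('delete', old.rowid, old.title, old.description, old.content, old.author);
END;
```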


@@ -1,52 +0,0 @@
package com.rssuper.database.daos
import androidx.room.Dao
import androidx.room.Delete
import androidx.room.Insert
import androidx.room.OnConflictStrategy
import androidx.room.Query
import androidx.room.Update
import com.rssuper.database.entities.BookmarkEntity
import kotlinx.coroutines.flow.Flow
@Dao
interface BookmarkDao {
@Query("SELECT * FROM bookmarks ORDER BY createdAt DESC")
fun getAllBookmarks(): Flow<List<BookmarkEntity>>
@Query("SELECT * FROM bookmarks WHERE id = :id")
suspend fun getBookmarkById(id: String): BookmarkEntity?
@Query("SELECT * FROM bookmarks WHERE feedItemId = :feedItemId")
suspend fun getBookmarkByFeedItemId(feedItemId: String): BookmarkEntity?
@Query("SELECT * FROM bookmarks WHERE tags LIKE '%' || :tag || '%' ORDER BY createdAt DESC")
fun getBookmarksByTag(tag: String): Flow<List<BookmarkEntity>>
@Query("SELECT * FROM bookmarks ORDER BY createdAt DESC LIMIT :limit OFFSET :offset")
suspend fun getBookmarksPaginated(limit: Int, offset: Int): List<BookmarkEntity>
@Insert(onConflict = OnConflictStrategy.REPLACE)
suspend fun insertBookmark(bookmark: BookmarkEntity): Long
@Insert(onConflict = OnConflictStrategy.REPLACE)
suspend fun insertBookmarks(bookmarks: List<BookmarkEntity>): List<Long>
@Update
suspend fun updateBookmark(bookmark: BookmarkEntity): Int
@Delete
suspend fun deleteBookmark(bookmark: BookmarkEntity): Int
@Query("DELETE FROM bookmarks WHERE id = :id")
suspend fun deleteBookmarkById(id: String): Int
@Query("DELETE FROM bookmarks WHERE feedItemId = :feedItemId")
suspend fun deleteBookmarkByFeedItemId(feedItemId: String): Int
@Query("SELECT COUNT(*) FROM bookmarks")
fun getBookmarkCount(): Flow<Int>
@Query("SELECT COUNT(*) FROM bookmarks WHERE tags LIKE '%' || :tag || '%'")
fun getBookmarkCountByTag(tag: String): Flow<Int>
}
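One caveat with the tag queries above: `tags LIKE '%' || :tag || '%'` matches substrings, so a tag can over-match when one tag is a suffix or prefix of another. A plain-Kotlin simulation of the pattern:

```kotlin
// Simulates SQLite's LIKE '%' || :tag || '%' against the comma-joined tags column.
fun likeTagMatch(tags: String, tag: String): Boolean = tag in tags

// An exact-token comparison over the comma-separated values avoids false positives.
fun exactTagMatch(tags: String, tag: String): Boolean = tag in tags.split(",")

fun main() {
    check(likeTagMatch("art,design", "art"))   // intended match
    check(likeTagMatch("smart,home", "art"))   // substring false positive
    check(!exactTagMatch("smart,home", "art")) // token match is precise
    println("ok")
}
```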


@@ -1,80 +0,0 @@
package com.rssuper.database.daos
import androidx.room.Dao
import androidx.room.Delete
import androidx.room.Insert
import androidx.room.OnConflictStrategy
import androidx.room.Query
import androidx.room.Update
import com.rssuper.database.entities.FeedItemEntity
import kotlinx.coroutines.flow.Flow
import java.util.Date
@Dao
interface FeedItemDao {
@Query("SELECT * FROM feed_items WHERE subscriptionId = :subscriptionId ORDER BY published DESC")
fun getItemsBySubscription(subscriptionId: String): Flow<List<FeedItemEntity>>
@Query("SELECT * FROM feed_items WHERE id = :id")
suspend fun getItemById(id: String): FeedItemEntity?
@Query("SELECT * FROM feed_items WHERE subscriptionId IN (:subscriptionIds) ORDER BY published DESC")
fun getItemsBySubscriptions(subscriptionIds: List<String>): Flow<List<FeedItemEntity>>
@Query("SELECT * FROM feed_items WHERE isRead = 0 ORDER BY published DESC")
fun getUnreadItems(): Flow<List<FeedItemEntity>>
@Query("SELECT * FROM feed_items WHERE isStarred = 1 ORDER BY published DESC")
fun getStarredItems(): Flow<List<FeedItemEntity>>
@Query("SELECT * FROM feed_items WHERE published > :date ORDER BY published DESC")
fun getItemsAfterDate(date: Date): Flow<List<FeedItemEntity>>
@Query("SELECT * FROM feed_items WHERE subscriptionId = :subscriptionId AND published > :date ORDER BY published DESC")
fun getSubscriptionItemsAfterDate(subscriptionId: String, date: Date): Flow<List<FeedItemEntity>>
@Query("SELECT COUNT(*) FROM feed_items WHERE subscriptionId = :subscriptionId AND isRead = 0")
fun getUnreadCount(subscriptionId: String): Flow<Int>
@Query("SELECT COUNT(*) FROM feed_items WHERE isRead = 0")
fun getTotalUnreadCount(): Flow<Int>
@Insert(onConflict = OnConflictStrategy.REPLACE)
suspend fun insertItem(item: FeedItemEntity): Long
@Insert(onConflict = OnConflictStrategy.REPLACE)
suspend fun insertItems(items: List<FeedItemEntity>): List<Long>
@Update
suspend fun updateItem(item: FeedItemEntity): Int
@Delete
suspend fun deleteItem(item: FeedItemEntity): Int
@Query("DELETE FROM feed_items WHERE id = :id")
suspend fun deleteItemById(id: String): Int
@Query("DELETE FROM feed_items WHERE subscriptionId = :subscriptionId")
suspend fun deleteItemsBySubscription(subscriptionId: String): Int
@Query("UPDATE feed_items SET isRead = 1 WHERE id = :id")
suspend fun markAsRead(id: String): Int
@Query("UPDATE feed_items SET isRead = 0 WHERE id = :id")
suspend fun markAsUnread(id: String): Int
@Query("UPDATE feed_items SET isStarred = 1 WHERE id = :id")
suspend fun markAsStarred(id: String): Int
@Query("UPDATE feed_items SET isStarred = 0 WHERE id = :id")
suspend fun markAsUnstarred(id: String): Int
@Query("UPDATE feed_items SET isRead = 1 WHERE subscriptionId = :subscriptionId")
suspend fun markAllAsRead(subscriptionId: String): Int
@Query("SELECT * FROM feed_items WHERE subscriptionId = :subscriptionId LIMIT :limit OFFSET :offset")
suspend fun getItemsPaginated(subscriptionId: String, limit: Int, offset: Int): List<FeedItemEntity>
    // The FTS table stores only the indexed columns, so join back to
    // feed_items to materialize full entities from a match.
    @Query("""
        SELECT feed_items.* FROM feed_items
        JOIN feed_items_fts ON feed_items.rowid = feed_items_fts.rowid
        WHERE feed_items_fts MATCH :query
        LIMIT :limit
    """)
    suspend fun searchByFts(query: String, limit: Int = 20): List<FeedItemEntity>
}


@@ -1,49 +0,0 @@
package com.rssuper.database.daos
import androidx.room.Dao
import androidx.room.Delete
import androidx.room.Insert
import androidx.room.OnConflictStrategy
import androidx.room.Query
import androidx.room.Update
import com.rssuper.database.entities.SearchHistoryEntity
import kotlinx.coroutines.flow.Flow
@Dao
interface SearchHistoryDao {
@Query("SELECT * FROM search_history ORDER BY timestamp DESC")
fun getAllSearchHistory(): Flow<List<SearchHistoryEntity>>
@Query("SELECT * FROM search_history WHERE id = :id")
suspend fun getSearchHistoryById(id: String): SearchHistoryEntity?
@Query("SELECT * FROM search_history WHERE query LIKE '%' || :query || '%' ORDER BY timestamp DESC")
fun searchHistory(query: String): Flow<List<SearchHistoryEntity>>
@Query("SELECT * FROM search_history ORDER BY timestamp DESC LIMIT :limit")
fun getRecentSearches(limit: Int): Flow<List<SearchHistoryEntity>>
@Query("SELECT COUNT(*) FROM search_history")
fun getSearchHistoryCount(): Flow<Int>
@Insert(onConflict = OnConflictStrategy.IGNORE)
suspend fun insertSearchHistory(search: SearchHistoryEntity): Long
@Insert(onConflict = OnConflictStrategy.IGNORE)
suspend fun insertSearchHistories(searches: List<SearchHistoryEntity>): List<Long>
@Update
suspend fun updateSearchHistory(search: SearchHistoryEntity): Int
@Delete
suspend fun deleteSearchHistory(search: SearchHistoryEntity): Int
@Query("DELETE FROM search_history WHERE id = :id")
suspend fun deleteSearchHistoryById(id: String): Int
@Query("DELETE FROM search_history")
suspend fun deleteAllSearchHistory(): Int
@Query("DELETE FROM search_history WHERE timestamp < :timestamp")
suspend fun deleteSearchHistoryOlderThan(timestamp: Long): Int
}


@@ -1,65 +0,0 @@
package com.rssuper.database.daos
import androidx.room.Dao
import androidx.room.Delete
import androidx.room.Insert
import androidx.room.OnConflictStrategy
import androidx.room.Query
import androidx.room.Update
import com.rssuper.database.entities.SubscriptionEntity
import kotlinx.coroutines.flow.Flow
import java.util.Date
@Dao
interface SubscriptionDao {
@Query("SELECT * FROM subscriptions ORDER BY title ASC")
fun getAllSubscriptions(): Flow<List<SubscriptionEntity>>
@Query("SELECT * FROM subscriptions WHERE id = :id")
suspend fun getSubscriptionById(id: String): SubscriptionEntity?
@Query("SELECT * FROM subscriptions WHERE url = :url")
suspend fun getSubscriptionByUrl(url: String): SubscriptionEntity?
@Query("SELECT * FROM subscriptions WHERE enabled = 1 ORDER BY title ASC")
fun getEnabledSubscriptions(): Flow<List<SubscriptionEntity>>
@Query("SELECT * FROM subscriptions WHERE category = :category ORDER BY title ASC")
fun getSubscriptionsByCategory(category: String): Flow<List<SubscriptionEntity>>
@Insert(onConflict = OnConflictStrategy.REPLACE)
suspend fun insertSubscription(subscription: SubscriptionEntity): Long
@Insert(onConflict = OnConflictStrategy.REPLACE)
suspend fun insertSubscriptions(subscriptions: List<SubscriptionEntity>): List<Long>
@Update
suspend fun updateSubscription(subscription: SubscriptionEntity): Int
@Delete
suspend fun deleteSubscription(subscription: SubscriptionEntity): Int
@Query("DELETE FROM subscriptions WHERE id = :id")
suspend fun deleteSubscriptionById(id: String): Int
@Query("SELECT COUNT(*) FROM subscriptions")
fun getSubscriptionCount(): Flow<Int>
@Query("UPDATE subscriptions SET error = :error WHERE id = :id")
suspend fun updateError(id: String, error: String?)
@Query("UPDATE subscriptions SET lastFetchedAt = :lastFetchedAt, error = NULL WHERE id = :id")
suspend fun updateLastFetchedAt(id: String, lastFetchedAt: Date)
@Query("UPDATE subscriptions SET nextFetchAt = :nextFetchAt WHERE id = :id")
suspend fun updateNextFetchAt(id: String, nextFetchAt: Date)
@Query("UPDATE subscriptions SET enabled = :enabled WHERE id = :id")
suspend fun setEnabled(id: String, enabled: Boolean): Int
@Query("UPDATE subscriptions SET lastFetchedAt = :lastFetchedAt, error = NULL WHERE id = :id")
suspend fun updateLastFetchedAtMillis(id: String, lastFetchedAt: Long): Int
@Query("UPDATE subscriptions SET nextFetchAt = :nextFetchAt WHERE id = :id")
suspend fun updateNextFetchAtMillis(id: String, nextFetchAt: Long): Int
}


@@ -1,43 +0,0 @@
package com.rssuper.database.entities
import androidx.room.Entity
import androidx.room.ForeignKey
import androidx.room.Index
import androidx.room.PrimaryKey
import java.util.Date
@Entity(
tableName = "bookmarks",
indices = [Index(value = ["feedItemId"], unique = true)]
)
data class BookmarkEntity(
@PrimaryKey
val id: String,
val feedItemId: String,
val title: String,
val link: String? = null,
val description: String? = null,
val content: String? = null,
val createdAt: Date,
val tags: String? = null
) {
fun toFeedItem(): FeedItemEntity {
return FeedItemEntity(
id = feedItemId,
subscriptionId = "", // Will be set when linked to subscription
title = title,
link = link,
description = description,
content = content,
published = createdAt,
updated = createdAt
)
}
}


@@ -1,57 +0,0 @@
package com.rssuper.database.entities
import androidx.room.Entity
import androidx.room.ForeignKey
import androidx.room.Index
import androidx.room.PrimaryKey
import java.util.Date
@Entity(
tableName = "feed_items",
foreignKeys = [
ForeignKey(
entity = SubscriptionEntity::class,
parentColumns = ["id"],
childColumns = ["subscriptionId"],
onDelete = ForeignKey.CASCADE
)
],
indices = [
Index(value = ["subscriptionId"]),
Index(value = ["published"])
]
)
data class FeedItemEntity(
@PrimaryKey
val id: String,
val subscriptionId: String,
val title: String,
val link: String? = null,
val description: String? = null,
val content: String? = null,
val author: String? = null,
val published: Date? = null,
val updated: Date? = null,
val categories: String? = null,
val enclosureUrl: String? = null,
val enclosureType: String? = null,
val enclosureLength: Long? = null,
val guid: String? = null,
val isRead: Boolean = false,
val isStarred: Boolean = false
)


@@ -1,19 +0,0 @@
package com.rssuper.database.entities
import androidx.room.Entity
import androidx.room.Index
import androidx.room.PrimaryKey
import java.util.Date
@Entity(
tableName = "search_history",
indices = [Index(value = ["timestamp"])]
)
data class SearchHistoryEntity(
@PrimaryKey
val id: String,
val query: String,
val timestamp: Date
)


@@ -1,54 +0,0 @@
package com.rssuper.database.entities
import androidx.room.Entity
import androidx.room.ForeignKey
import androidx.room.Index
import androidx.room.PrimaryKey
import com.rssuper.models.HttpAuth
import java.util.Date
@Entity(
tableName = "subscriptions",
indices = [Index(value = ["url"], unique = true)]
)
data class SubscriptionEntity(
@PrimaryKey
val id: String,
val url: String,
val title: String,
val category: String? = null,
val enabled: Boolean = true,
val fetchInterval: Long = 3600000,
val createdAt: Date,
val updatedAt: Date,
val lastFetchedAt: Date? = null,
val nextFetchAt: Date? = null,
val error: String? = null,
val httpAuthUsername: String? = null,
val httpAuthPassword: String? = null
) {
fun toHttpAuth(): HttpAuth? {
return if (httpAuthUsername != null && httpAuthPassword != null) {
HttpAuth(httpAuthUsername, httpAuthPassword)
} else null
}
fun fromHttpAuth(auth: HttpAuth?): SubscriptionEntity {
return copy(
httpAuthUsername = auth?.username,
httpAuthPassword = auth?.password
)
}
}
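The HttpAuth mapping above can be exercised without Room. In this sketch, `HttpAuth` is declared locally with the shape inferred from its usage here (two string fields); the real class lives in `com.rssuper.models`.

```kotlin
// Assumed shape of com.rssuper.models.HttpAuth, inferred from its usage above.
data class HttpAuth(val username: String, val password: String)

// Mirrors SubscriptionEntity.toHttpAuth: auth exists only if both fields are set.
fun toHttpAuth(username: String?, password: String?): HttpAuth? =
    if (username != null && password != null) HttpAuth(username, password) else null

fun main() {
    check(toHttpAuth("alice", "secret") == HttpAuth("alice", "secret"))
    check(toHttpAuth("alice", null) == null) // partial credentials yield no auth
    println("ok")
}
```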


@@ -1,9 +0,0 @@
package com.rssuper.model
sealed interface Error {
data class Network(val message: String, val code: Int? = null) : Error
data class Database(val message: String, val cause: Throwable? = null) : Error
data class Parsing(val message: String, val cause: Throwable? = null) : Error
data class Auth(val message: String) : Error
data object Unknown : Error
}


@@ -1,8 +0,0 @@
package com.rssuper.model
sealed interface State<out T> {
data object Idle : State<Nothing>
data object Loading : State<Nothing>
data class Success<T>(val data: T) : State<T>
data class Error(val message: String, val cause: Throwable? = null) : State<Nothing>
}
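Because State is a sealed interface, a `when` over it is exhaustive without an `else` branch: the compiler flags any unhandled case. A minimal sketch (the interface is repeated here so the example stands alone; `render` is a hypothetical consumer):

```kotlin
// Same shape as the State sealed interface above.
sealed interface State<out T> {
    data object Idle : State<Nothing>
    data object Loading : State<Nothing>
    data class Success<T>(val data: T) : State<T>
    data class Error(val message: String, val cause: Throwable? = null) : State<Nothing>
}

// Exhaustive when: adding a new State case becomes a compile error here.
fun render(state: State<Int>): String = when (state) {
    State.Idle -> "idle"
    State.Loading -> "loading"
    is State.Success -> "value=${state.data}"
    is State.Error -> "error: ${state.message}"
}

fun main() {
    check(render(State.Loading) == "loading")
    check(render(State.Success(7)) == "value=7")
    check(render(State.Error("boom")) == "error: boom")
    println("ok")
}
```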


@@ -1,60 +0,0 @@
package com.rssuper.models
import android.os.Parcelable
import androidx.room.Entity
import androidx.room.PrimaryKey
import androidx.room.TypeConverters
import com.rssuper.converters.DateConverter
import com.rssuper.converters.FeedItemListConverter
import kotlinx.parcelize.Parcelize
import com.squareup.moshi.Json
import com.squareup.moshi.JsonClass
import java.util.Date
@JsonClass(generateAdapter = true)
@Parcelize
@TypeConverters(DateConverter::class, FeedItemListConverter::class)
@Entity(tableName = "feeds")
data class Feed(
@PrimaryKey
val id: String,
@Json(name = "title")
val title: String,
@Json(name = "link")
val link: String? = null,
@Json(name = "description")
val description: String? = null,
@Json(name = "subtitle")
val subtitle: String? = null,
@Json(name = "language")
val language: String? = null,
@Json(name = "lastBuildDate")
val lastBuildDate: Date? = null,
@Json(name = "updated")
val updated: Date? = null,
@Json(name = "generator")
val generator: String? = null,
@Json(name = "ttl")
val ttl: Int? = null,
@Json(name = "items")
val items: List<FeedItem> = emptyList(),
@Json(name = "rawUrl")
val rawUrl: String,
@Json(name = "lastFetchedAt")
val lastFetchedAt: Date? = null,
@Json(name = "nextFetchAt")
val nextFetchAt: Date? = null
) : Parcelable


@@ -1,67 +0,0 @@
package com.rssuper.models
import android.os.Parcelable
import androidx.room.Entity
import androidx.room.PrimaryKey
import androidx.room.TypeConverters
import com.rssuper.converters.DateConverter
import com.rssuper.converters.StringListConverter
import kotlinx.parcelize.Parcelize
import com.squareup.moshi.Json
import com.squareup.moshi.JsonClass
import java.util.Date
@JsonClass(generateAdapter = true)
@Parcelize
@TypeConverters(DateConverter::class, StringListConverter::class)
@Entity(tableName = "feed_items")
data class FeedItem(
@PrimaryKey
val id: String,
@Json(name = "title")
val title: String,
@Json(name = "link")
val link: String? = null,
@Json(name = "description")
val description: String? = null,
@Json(name = "content")
val content: String? = null,
@Json(name = "author")
val author: String? = null,
@Json(name = "published")
val published: Date? = null,
@Json(name = "updated")
val updated: Date? = null,
@Json(name = "categories")
val categories: List<String>? = null,
@Json(name = "enclosure")
val enclosure: Enclosure? = null,
@Json(name = "guid")
val guid: String? = null,
@Json(name = "subscriptionTitle")
val subscriptionTitle: String? = null
) : Parcelable
@JsonClass(generateAdapter = true)
@Parcelize
data class Enclosure(
@Json(name = "url")
val url: String,
@Json(name = "type")
val type: String,
@Json(name = "length")
val length: Long? = null
) : Parcelable

View File

@@ -1,63 +0,0 @@
package com.rssuper.models
import android.os.Parcelable
import androidx.room.Entity
import androidx.room.PrimaryKey
import androidx.room.TypeConverters
import com.rssuper.converters.DateConverter
import kotlinx.parcelize.Parcelize
import com.squareup.moshi.Json
import com.squareup.moshi.JsonClass
import java.util.Date
@JsonClass(generateAdapter = true)
@Parcelize
@TypeConverters(DateConverter::class)
@Entity(tableName = "feed_subscriptions")
data class FeedSubscription(
@PrimaryKey
val id: String,
@Json(name = "url")
val url: String,
@Json(name = "title")
val title: String,
@Json(name = "category")
val category: String? = null,
@Json(name = "enabled")
val enabled: Boolean = true,
@Json(name = "fetchInterval")
val fetchInterval: Long,
@Json(name = "createdAt")
val createdAt: Date,
@Json(name = "updatedAt")
val updatedAt: Date,
@Json(name = "lastFetchedAt")
val lastFetchedAt: Date? = null,
@Json(name = "nextFetchAt")
val nextFetchAt: Date? = null,
@Json(name = "error")
val error: String? = null,
@Json(name = "httpAuth")
val httpAuth: HttpAuth? = null
) : Parcelable
@JsonClass(generateAdapter = true)
@Parcelize
data class HttpAuth(
@Json(name = "username")
val username: String,
@Json(name = "password")
val password: String
) : Parcelable

View File

@@ -1,34 +0,0 @@
package com.rssuper.models
import android.os.Parcelable
import androidx.room.Entity
import androidx.room.PrimaryKey
import kotlinx.parcelize.Parcelize
import com.squareup.moshi.Json
import com.squareup.moshi.JsonClass
@JsonClass(generateAdapter = true)
@Parcelize
@Entity(tableName = "notification_preferences")
data class NotificationPreferences(
@PrimaryKey
val id: String = "default",
@Json(name = "newArticles")
val newArticles: Boolean = true,
@Json(name = "episodeReleases")
val episodeReleases: Boolean = true,
@Json(name = "customAlerts")
val customAlerts: Boolean = false,
@Json(name = "badgeCount")
val badgeCount: Boolean = true,
@Json(name = "sound")
val sound: Boolean = true,
@Json(name = "vibration")
val vibration: Boolean = true
) : Parcelable

View File

@@ -1,60 +0,0 @@
package com.rssuper.models
import android.os.Parcelable
import androidx.room.Entity
import androidx.room.PrimaryKey
import kotlinx.parcelize.Parcelize
import kotlinx.parcelize.RawValue
import com.squareup.moshi.Json
import com.squareup.moshi.JsonClass
@JsonClass(generateAdapter = true)
@Parcelize
@Entity(tableName = "reading_preferences")
data class ReadingPreferences(
@PrimaryKey
val id: String = "default",
@Json(name = "fontSize")
val fontSize: @RawValue FontSize = FontSize.MEDIUM,
@Json(name = "lineHeight")
val lineHeight: @RawValue LineHeight = LineHeight.NORMAL,
@Json(name = "showTableOfContents")
val showTableOfContents: Boolean = false,
@Json(name = "showReadingTime")
val showReadingTime: Boolean = true,
@Json(name = "showAuthor")
val showAuthor: Boolean = true,
@Json(name = "showDate")
val showDate: Boolean = true
) : Parcelable
sealed class FontSize(val value: String) {
@Json(name = "small")
data object SMALL : FontSize("small")
@Json(name = "medium")
data object MEDIUM : FontSize("medium")
@Json(name = "large")
data object LARGE : FontSize("large")
@Json(name = "xlarge")
data object XLARGE : FontSize("xlarge")
}
sealed class LineHeight(val value: String) {
@Json(name = "normal")
data object NORMAL : LineHeight("normal")
@Json(name = "relaxed")
data object RELAXED : LineHeight("relaxed")
@Json(name = "loose")
data object LOOSE : LineHeight("loose")
}

View File

@@ -1,74 +0,0 @@
package com.rssuper.models
import android.os.Parcelable
import androidx.room.Entity
import androidx.room.PrimaryKey
import androidx.room.TypeConverters
import com.rssuper.converters.DateConverter
import com.rssuper.converters.StringListConverter
import kotlinx.parcelize.Parcelize
import kotlinx.parcelize.RawValue
import com.squareup.moshi.Json
import com.squareup.moshi.JsonClass
import java.util.Date
@JsonClass(generateAdapter = true)
@Parcelize
@TypeConverters(DateConverter::class, StringListConverter::class)
@Entity(tableName = "search_filters")
data class SearchFilters(
@PrimaryKey
val id: String = "default",
@Json(name = "dateFrom")
val dateFrom: Date? = null,
@Json(name = "dateTo")
val dateTo: Date? = null,
@Json(name = "feedIds")
val feedIds: List<String>? = null,
@Json(name = "authors")
val authors: List<String>? = null,
@Json(name = "contentType")
val contentType: @RawValue ContentType? = null,
@Json(name = "sortOption")
val sortOption: @RawValue SearchSortOption = SearchSortOption.RELEVANCE
) : Parcelable
sealed class ContentType(val value: String) {
@Json(name = "article")
data object ARTICLE : ContentType("article")
@Json(name = "audio")
data object AUDIO : ContentType("audio")
@Json(name = "video")
data object VIDEO : ContentType("video")
}
sealed class SearchSortOption(val value: String) {
@Json(name = "relevance")
data object RELEVANCE : SearchSortOption("relevance")
@Json(name = "date_desc")
data object DATE_DESC : SearchSortOption("date_desc")
@Json(name = "date_asc")
data object DATE_ASC : SearchSortOption("date_asc")
@Json(name = "title_asc")
data object TITLE_ASC : SearchSortOption("title_asc")
@Json(name = "title_desc")
data object TITLE_DESC : SearchSortOption("title_desc")
@Json(name = "feed_asc")
data object FEED_ASC : SearchSortOption("feed_asc")
@Json(name = "feed_desc")
data object FEED_DESC : SearchSortOption("feed_desc")
}

View File

@@ -1,49 +0,0 @@
package com.rssuper.models
import android.os.Parcelable
import androidx.room.Entity
import androidx.room.PrimaryKey
import androidx.room.TypeConverters
import com.rssuper.converters.DateConverter
import kotlinx.parcelize.Parcelize
import com.squareup.moshi.Json
import com.squareup.moshi.JsonClass
import java.util.Date
@JsonClass(generateAdapter = true)
@Parcelize
@TypeConverters(DateConverter::class)
@Entity(tableName = "search_results")
data class SearchResult(
@PrimaryKey
val id: String,
@Json(name = "type")
val type: SearchResultType,
@Json(name = "title")
val title: String,
@Json(name = "snippet")
val snippet: String? = null,
@Json(name = "link")
val link: String? = null,
@Json(name = "feedTitle")
val feedTitle: String? = null,
@Json(name = "published")
val published: Date? = null,
@Json(name = "score")
val score: Double? = null
) : Parcelable
enum class SearchResultType {
@Json(name = "article")
ARTICLE,
@Json(name = "feed")
FEED
}

View File

@@ -1,240 +0,0 @@
package com.rssuper.parsing
import com.rssuper.models.Enclosure
import com.rssuper.models.Feed
import com.rssuper.models.FeedItem
import org.xmlpull.v1.XmlPullParser
import org.xmlpull.v1.XmlPullParserFactory
import java.io.StringReader
object AtomParser {
private val ATOM_NS = "http://www.w3.org/2005/Atom"
private val ITUNES_NS = "http://www.itunes.com/dtds/podcast-1.0.dtd"
private val MEDIA_NS = "http://search.yahoo.com/mrss/"
fun parse(xml: String, feedUrl: String): Feed {
val factory = XmlPullParserFactory.newInstance()
factory.isNamespaceAware = true
val parser = factory.newPullParser()
parser.setInput(StringReader(xml))
var title: String? = null
var link: String? = null
var subtitle: String? = null
var updated: java.util.Date? = null
var generator: String? = null
val items = mutableListOf<FeedItem>()
var currentItem: MutableMap<String, Any?>? = null
var currentTag: String? = null
var inContent = false
var eventType = parser.eventType
while (eventType != XmlPullParser.END_DOCUMENT) {
when (eventType) {
XmlPullParser.START_TAG -> {
val tagName = parser.name
val namespace = parser.namespace
when {
tagName == "feed" -> {}
tagName == "entry" -> {
currentItem = mutableMapOf()
}
tagName == "title" -> {
currentTag = tagName
inContent = true
}
tagName == "link" -> {
val href = parser.getAttributeValue(null, "href")
val rel = parser.getAttributeValue(null, "rel")
if (href != null) {
if (currentItem != null) {
if (rel == "alternate" || rel == null) {
currentItem["link"] = href
} else if (rel == "enclosure") {
val type = parser.getAttributeValue(null, "type") ?: "application/octet-stream"
val length = parser.getAttributeValue(null, "length")?.toLongOrNull()
currentItem["enclosure"] = Enclosure(href, type, length)
}
} else {
if (rel == "alternate" || rel == null) {
link = href
}
}
}
currentTag = null
inContent = false
}
tagName == "subtitle" -> {
currentTag = tagName
inContent = true
}
tagName == "summary" && namespace != ITUNES_NS -> {
// Guard on namespace so the itunes:summary branch below stays reachable.
currentTag = tagName
inContent = true
}
tagName == "content" -> {
currentTag = tagName
inContent = true
}
tagName == "updated" || tagName == "published" -> {
currentTag = tagName
inContent = true
}
tagName == "name" -> {
currentTag = tagName
inContent = true
}
tagName == "uri" -> {
currentTag = tagName
inContent = true
}
tagName == "id" -> {
currentTag = tagName
inContent = true
}
tagName == "category" -> {
val term = parser.getAttributeValue(null, "term")
if (term != null && currentItem != null) {
val cats = currentItem["categories"] as? MutableList<String> ?: mutableListOf()
cats.add(term)
currentItem["categories"] = cats
}
currentTag = null
inContent = false
}
tagName == "generator" -> {
currentTag = tagName
inContent = true
}
tagName == "summary" && namespace == ITUNES_NS -> {
if (currentItem != null) {
currentItem["itunesSummary"] = readElementText(parser)
}
}
tagName == "image" && namespace == ITUNES_NS -> {
val href = parser.getAttributeValue(null, "href")
if (href != null && currentItem != null) {
currentItem["image"] = href
}
}
tagName == "duration" && namespace == ITUNES_NS -> {
currentItem?.put("duration", readElementText(parser))
}
tagName == "thumbnail" && namespace == MEDIA_NS -> {
val url = parser.getAttributeValue(null, "url")
if (url != null && currentItem != null) {
currentItem["mediaThumbnail"] = url
}
}
tagName == "enclosure" && namespace == MEDIA_NS -> {
val url = parser.getAttributeValue(null, "url")
val type = parser.getAttributeValue(null, "type")
val length = parser.getAttributeValue(null, "length")?.toLongOrNull()
if (url != null && type != null && currentItem != null) {
currentItem["enclosure"] = Enclosure(url, type, length)
}
}
else -> {}
}
}
XmlPullParser.TEXT -> {
val text = parser.text?.xmlTrimmed() ?: ""
if (text.isNotEmpty() && inContent) {
if (currentItem != null) {
when (currentTag) {
"title" -> currentItem["title"] = text
"summary" -> currentItem["summary"] = text
"content" -> currentItem["content"] = text
"name" -> currentItem["author"] = text
"id" -> currentItem["guid"] = text
"updated", "published" -> currentItem[currentTag] = text
}
} else {
when (currentTag) {
"title" -> title = text
"subtitle" -> subtitle = text
"id" -> if (title == null) title = text
"updated" -> updated = XmlDateParser.parse(text)
"generator" -> generator = text
}
}
}
}
XmlPullParser.END_TAG -> {
val tagName = parser.name
if (tagName == "entry" && currentItem != null) {
items.add(buildFeedItem(currentItem))
currentItem = null
}
if (tagName == currentTag) {
currentTag = null
inContent = false
}
}
}
eventType = parser.next()
}
return Feed(
id = generateUuid(),
title = title ?: "Untitled Feed",
link = link,
subtitle = subtitle,
description = subtitle,
updated = updated,
generator = generator,
items = items,
rawUrl = feedUrl,
lastFetchedAt = java.util.Date()
)
}
private fun readElementText(parser: XmlPullParser): String {
var text = ""
var eventType = parser.eventType
while (eventType != XmlPullParser.END_TAG) {
if (eventType == XmlPullParser.TEXT) {
text = parser.text.xmlDecoded()
}
eventType = parser.next()
}
return text.xmlTrimmed()
}
@Suppress("UNCHECKED_CAST")
private fun buildFeedItem(item: Map<String, Any?>): FeedItem {
val title = item["title"] as? String ?: "Untitled"
val link = item["link"] as? String
val summary = item["summary"] as? String
val content = item["content"] as? String ?: summary
val itunesSummary = item["itunesSummary"] as? String
val author = item["author"] as? String
val guid = item["guid"] as? String ?: link ?: generateUuid()
val categories = item["categories"] as? List<String>
val enclosure = item["enclosure"] as? Enclosure
val updatedStr = item["updated"] as? String
val publishedStr = item["published"] as? String
val published = XmlDateParser.parse(publishedStr ?: updatedStr)
val updated = XmlDateParser.parse(updatedStr)
return FeedItem(
id = generateUuid(),
title = title,
link = link,
description = summary ?: itunesSummary,
content = content,
author = author,
published = published,
updated = updated,
categories = categories,
enclosure = enclosure,
guid = guid
)
}
}

View File

@@ -1,67 +0,0 @@
package com.rssuper.parsing
import com.rssuper.models.Feed
import org.xmlpull.v1.XmlPullParser
import org.xmlpull.v1.XmlPullParserFactory
import java.io.StringReader
import java.util.Date
object FeedParser {
fun parse(xml: String, feedUrl: String): ParseResult {
val feedType = detectFeedType(xml)
return when (feedType) {
FeedType.RSS -> {
val feed = RSSParser.parse(xml, feedUrl)
ParseResult(FeedType.RSS, feed)
}
FeedType.Atom -> {
val feed = AtomParser.parse(xml, feedUrl)
ParseResult(FeedType.Atom, feed)
}
}
}
fun parseAsync(xml: String, feedUrl: String, callback: (Result<ParseResult>) -> Unit) {
try {
val result = parse(xml, feedUrl)
callback(Result.success(result))
} catch (e: Exception) {
callback(Result.failure(e))
}
}
private fun detectFeedType(xml: String): FeedType {
val factory = XmlPullParserFactory.newInstance()
factory.isNamespaceAware = true
val parser = factory.newPullParser()
parser.setInput(StringReader(xml))
var eventType = parser.eventType
while (eventType != XmlPullParser.END_DOCUMENT) {
if (eventType == XmlPullParser.START_TAG) {
val tagName = parser.name
return when {
tagName.equals("rss", ignoreCase = true) -> FeedType.RSS
tagName.equals("feed", ignoreCase = true) -> FeedType.Atom
tagName.equals("RDF", ignoreCase = true) -> FeedType.RSS
else -> throw FeedParsingError.UnsupportedFeedType
}
}
eventType = parser.next()
}
throw FeedParsingError.UnsupportedFeedType
}
}
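A hypothetical call site for the facade above, assuming the XML has already been fetched (the variable names are illustrative):

```kotlin
runCatching { FeedParser.parse(xml, "https://example.com/feed.xml") }
    .onSuccess { result ->
        println("Parsed ${result.feedType.value} feed: ${result.feed.items.size} items")
    }
    .onFailure { e ->
        // Thrown when no rss/feed/RDF root element is found.
        if (e is FeedParsingError.UnsupportedFeedType) println("Unsupported feed format")
    }
```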

View File

@@ -1,16 +0,0 @@
package com.rssuper.parsing
sealed class FeedType(val value: String) {
data object RSS : FeedType("rss")
data object Atom : FeedType("atom")
companion object {
fun fromString(value: String): FeedType {
return when (value.lowercase()) {
"rss" -> RSS
"atom" -> Atom
else -> throw IllegalArgumentException("Unknown feed type: $value")
}
}
}
}
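`fromString` inverts the `value` property; a round-trip sketch:

```kotlin
check(FeedType.fromString("rss") == FeedType.RSS)
check(FeedType.fromString(FeedType.Atom.value) == FeedType.Atom)
// Any other string throws IllegalArgumentException:
check(runCatching { FeedType.fromString("json") }.isFailure)
```

Input is lowercased first, so `"RSS"` and `"Atom"` also resolve.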

View File

@@ -1,13 +0,0 @@
package com.rssuper.parsing
import com.rssuper.models.Feed
data class ParseResult(
val feedType: FeedType,
val feed: Feed
)
sealed class FeedParsingError : Exception() {
data object UnsupportedFeedType : FeedParsingError()
data object MalformedXml : FeedParsingError()
}

View File

@@ -1,188 +0,0 @@
package com.rssuper.parsing
import com.rssuper.models.Enclosure
import com.rssuper.models.Feed
import com.rssuper.models.FeedItem
import org.xmlpull.v1.XmlPullParser
import org.xmlpull.v1.XmlPullParserFactory
import java.io.StringReader
import java.util.Date
object RSSParser {
private val ITUNES_NS = "http://www.itunes.com/dtds/podcast-1.0.dtd"
private val CONTENT_NS = "http://purl.org/rss/1.0/modules/content/"
fun parse(xml: String, feedUrl: String): Feed {
val factory = XmlPullParserFactory.newInstance()
factory.isNamespaceAware = true
val parser = factory.newPullParser()
parser.setInput(StringReader(xml))
var title: String? = null
var link: String? = null
var description: String? = null
var language: String? = null
var lastBuildDate: Date? = null
var generator: String? = null
var ttl: Int? = null
val items = mutableListOf<FeedItem>()
var currentItem: MutableMap<String, Any?>? = null
var currentTag: String? = null
var eventType = parser.eventType
while (eventType != XmlPullParser.END_DOCUMENT) {
when (eventType) {
XmlPullParser.START_TAG -> {
val tagName = parser.name
val namespace = parser.namespace
when {
tagName == "channel" -> {}
tagName == "item" -> {
currentItem = mutableMapOf()
}
tagName == "title" || tagName == "description" ||
tagName == "link" || tagName == "author" ||
tagName == "guid" || tagName == "pubDate" ||
tagName == "category" || tagName == "enclosure" -> {
currentTag = tagName
}
tagName == "language" -> currentTag = tagName
tagName == "lastBuildDate" -> currentTag = tagName
tagName == "generator" -> currentTag = tagName
tagName == "ttl" -> currentTag = tagName
tagName == "subtitle" && namespace == ITUNES_NS -> {
if (currentItem == null) {
description = readElementText(parser)
}
}
tagName == "summary" && namespace == ITUNES_NS -> {
currentItem?.put("description", readElementText(parser))
}
tagName == "duration" && namespace == ITUNES_NS -> {
currentItem?.put("duration", readElementText(parser))
}
tagName == "image" && namespace == ITUNES_NS -> {
val href = parser.getAttributeValue(null, "href")
if (href != null && currentItem != null) {
currentItem.put("image", href)
}
}
tagName == "encoded" && namespace == CONTENT_NS -> {
currentItem?.put("content", readElementText(parser))
}
else -> {}
}
if (tagName == "enclosure" && currentItem != null) {
val url = parser.getAttributeValue(null, "url")
val type = parser.getAttributeValue(null, "type")
val length = parser.getAttributeValue(null, "length")?.toLongOrNull()
if (url != null && type != null) {
currentItem["enclosure"] = Enclosure(url, type, length)
}
}
}
XmlPullParser.TEXT -> {
val text = parser.text?.xmlTrimmed() ?: ""
if (text.isNotEmpty()) {
if (currentItem != null) {
when (currentTag) {
"title" -> currentItem["title"] = text
"description" -> currentItem["description"] = text
"link" -> currentItem["link"] = text
"author" -> currentItem["author"] = text
"guid" -> currentItem["guid"] = text
"pubDate" -> currentItem["pubDate"] = text
"category" -> {
val cats = currentItem["categories"] as? MutableList<String> ?: mutableListOf()
cats.add(text)
currentItem["categories"] = cats
}
}
} else {
when (currentTag) {
"title" -> title = text
"link" -> link = text
"description" -> description = text
"language" -> language = text
"lastBuildDate" -> lastBuildDate = XmlDateParser.parse(text)
"generator" -> generator = text
"ttl" -> ttl = text.toIntOrNull()
}
}
}
}
XmlPullParser.END_TAG -> {
val tagName = parser.name
if (tagName == "item" && currentItem != null) {
items.add(buildFeedItem(currentItem))
currentItem = null
}
currentTag = null
}
}
eventType = parser.next()
}
return Feed(
id = generateUuid(),
title = title ?: "Untitled Feed",
link = link,
description = description,
language = language,
lastBuildDate = lastBuildDate,
generator = generator,
ttl = ttl,
items = items,
rawUrl = feedUrl,
lastFetchedAt = Date()
)
}
private fun readElementText(parser: XmlPullParser): String {
var text = ""
var eventType = parser.eventType
while (eventType != XmlPullParser.END_TAG) {
if (eventType == XmlPullParser.TEXT) {
text = parser.text.xmlDecoded()
}
eventType = parser.next()
}
return text.xmlTrimmed()
}
@Suppress("UNCHECKED_CAST")
private fun buildFeedItem(item: Map<String, Any?>): FeedItem {
val title = item["title"] as? String ?: "Untitled"
val link = item["link"] as? String
val description = item["description"] as? String
val content = item["content"] as? String ?: description
val author = item["author"] as? String
val guid = item["guid"] as? String ?: link ?: generateUuid()
val categories = item["categories"] as? List<String>
val enclosure = item["enclosure"] as? Enclosure
val pubDateStr = item["pubDate"] as? String
val published = XmlDateParser.parse(pubDateStr)
return FeedItem(
id = generateUuid(),
title = title,
link = link,
description = description,
content = content,
author = author,
published = published,
updated = published,
categories = categories,
enclosure = enclosure,
guid = guid
)
}
}

View File

@@ -1,154 +0,0 @@
package com.rssuper.parsing
import java.text.SimpleDateFormat
import java.util.Locale
import java.util.TimeZone
import java.util.UUID
import java.util.regex.Pattern
object XmlDateParser {
private val iso8601WithFractional: SimpleDateFormat by lazy {
SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSSXXX", Locale.US).apply {
timeZone = TimeZone.getTimeZone("UTC")
}
}
private val iso8601: SimpleDateFormat by lazy {
SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ssXXX", Locale.US).apply {
timeZone = TimeZone.getTimeZone("UTC")
}
}
private val dateFormats: List<SimpleDateFormat> by lazy {
listOf(
SimpleDateFormat("EEE, dd MMM yyyy HH:mm:ss Z", Locale.US),
SimpleDateFormat("EEE, dd MMM yyyy HH:mm Z", Locale.US),
SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ssZ", Locale.US),
SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSSZ", Locale.US),
SimpleDateFormat("yyyy-MM-dd HH:mm:ss Z", Locale.US),
SimpleDateFormat("yyyy-MM-dd", Locale.US)
).map {
SimpleDateFormat(it.toPattern(), Locale.US).apply {
timeZone = TimeZone.getTimeZone("UTC")
}
}
}
fun parse(value: String?): java.util.Date? {
val trimmed = value?.xmlTrimmed() ?: return null
if (trimmed.isEmpty()) return null
return try {
iso8601WithFractional.parse(trimmed)
} catch (e: Exception) {
try {
iso8601.parse(trimmed)
} catch (e: Exception) {
for (format in dateFormats) {
try {
return format.parse(trimmed)
} catch (e: Exception) {
continue
}
}
null
}
}
}
}
fun String.xmlTrimmed(): String = this.trim { it <= ' ' }
fun String.xmlNilIfEmpty(): String? {
val trimmed = this.xmlTrimmed()
return if (trimmed.isEmpty()) null else trimmed
}
fun String.xmlDecoded(): String {
return this
.replace(Regex("<!\\[CDATA\\[", RegexOption.IGNORE_CASE), "")
.replace(Regex("\\]\\]>", RegexOption.IGNORE_CASE), "")
.replace("&lt;", "<")
.replace("&gt;", ">")
.replace("&amp;", "&")
.replace("&quot;", "\"")
.replace("&apos;", "'")
.replace("&#39;", "'")
.replace("&#x27;", "'")
}
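Note the replacement order in `xmlDecoded`: `&amp;` is handled after `&lt;`/`&gt;`, so double-encoded input decodes exactly one level. A quick sketch of the expected behaviour (assuming the extensions above are in scope):

```kotlin
check("&lt;b&gt;bold&lt;/b&gt;".xmlDecoded() == "<b>bold</b>")
check("<![CDATA[a &amp; b]]>".xmlDecoded() == "a & b")
check("&amp;lt;".xmlDecoded() == "&lt;") // one level only
```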
fun xmlInt64(value: String?): Long? {
val trimmed = value?.xmlTrimmed() ?: return null
if (trimmed.isEmpty()) return null
return trimmed.toLongOrNull()
}
fun xmlInt(value: String?): Int? {
val trimmed = value?.xmlTrimmed() ?: return null
if (trimmed.isEmpty()) return null
return trimmed.toIntOrNull()
}
fun xmlFirstTagValue(tag: String, inXml: String): String? {
val pattern = Pattern.compile("(?is)<(?:\\w+:)?$tag\\b[^>]*>(.*?)</(?:\\w+:)?${tag}>", Pattern.CASE_INSENSITIVE)
val matcher = pattern.matcher(inXml)
return if (matcher.find()) {
matcher.group(1)?.xmlDecoded()?.xmlTrimmed()
} else {
null
}
}
fun xmlAllTagValues(tag: String, inXml: String): List<String> {
val pattern = Pattern.compile("(?is)<(?:\\w+:)?$tag\\b[^>]*>(.*?)</(?:\\w+:)?${tag}>", Pattern.CASE_INSENSITIVE)
val matcher = pattern.matcher(inXml)
val results = mutableListOf<String>()
while (matcher.find()) {
matcher.group(1)?.xmlDecoded()?.xmlTrimmed()?.let { value ->
if (value.isNotEmpty()) {
results.add(value)
}
}
}
return results
}
fun xmlFirstBlock(tag: String, inXml: String): String? {
val pattern = Pattern.compile("(?is)<(?:\\w+:)?$tag\\b[^>]*>(.*?)</(?:\\w+:)?${tag}>", Pattern.CASE_INSENSITIVE)
val matcher = pattern.matcher(inXml)
return if (matcher.find()) matcher.group(1) else null
}
fun xmlAllBlocks(tag: String, inXml: String): List<String> {
val pattern = Pattern.compile("(?is)<(?:\\w+:)?$tag\\b[^>]*>(.*?)</(?:\\w+:)?${tag}>", Pattern.CASE_INSENSITIVE)
val matcher = pattern.matcher(inXml)
val results = mutableListOf<String>()
while (matcher.find()) {
matcher.group(1)?.let { results.add(it) }
}
return results
}
fun xmlAllTagAttributes(tag: String, inXml: String): List<Map<String, String>> {
val pattern = Pattern.compile("(?is)<(?:\\w+:)?$tag\\b([^>]*)/?>", Pattern.CASE_INSENSITIVE)
val matcher = pattern.matcher(inXml)
val results = mutableListOf<Map<String, String>>()
while (matcher.find()) {
matcher.group(1)?.let { results.add(parseXmlAttributes(it)) }
}
return results
}
private fun parseXmlAttributes(raw: String): Map<String, String> {
val pattern = Pattern.compile("(\\w+(?::\\w+)?)\\s*=\\s*\"([^\"]*)\"")
val matcher = pattern.matcher(raw)
val result = mutableMapOf<String, String>()
while (matcher.find()) {
val key = matcher.group(1)?.lowercase() ?: continue
val value = matcher.group(2)?.xmlDecoded()?.xmlTrimmed() ?: continue
result[key] = value
}
return result
}
fun generateUuid(): String = UUID.randomUUID().toString()
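The parser above tries the ISO 8601 patterns first and only then falls back through the RFC 822-style formats; a hedged usage sketch:

```kotlin
// Each call returns a java.util.Date, or null when no format matches.
val atomDate = XmlDateParser.parse("2024-05-01T12:30:00Z")           // matched by the ISO 8601 pattern
val rssDate = XmlDateParser.parse("Wed, 01 May 2024 12:30:00 +0000") // matched by the RFC 822 fallback
check(atomDate != null && rssDate != null && atomDate == rssDate)
check(XmlDateParser.parse("not a date") == null)
```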

View File

@@ -1,91 +0,0 @@
package com.rssuper.repository
import com.rssuper.database.daos.BookmarkDao
import com.rssuper.database.entities.BookmarkEntity
import com.rssuper.state.BookmarkState
import kotlinx.coroutines.flow.Flow
import kotlinx.coroutines.flow.catch
import kotlinx.coroutines.flow.map
class BookmarkRepository(
private val bookmarkDao: BookmarkDao
) {
fun getAllBookmarks(): Flow<BookmarkState> {
return bookmarkDao.getAllBookmarks().map { bookmarks ->
BookmarkState.Success(bookmarks)
}.catch { e ->
emit(BookmarkState.Error("Failed to load bookmarks", e))
}
}
fun getBookmarksByTag(tag: String): Flow<BookmarkState> {
return bookmarkDao.getBookmarksByTag(tag).map { bookmarks ->
BookmarkState.Success(bookmarks)
}.catch { e ->
emit(BookmarkState.Error("Failed to load bookmarks by tag", e))
}
}
suspend fun getBookmarkById(id: String): BookmarkEntity? {
return try {
bookmarkDao.getBookmarkById(id)
} catch (e: Exception) {
throw RuntimeException("Failed to get bookmark", e)
}
}
suspend fun getBookmarkByFeedItemId(feedItemId: String): BookmarkEntity? {
return try {
bookmarkDao.getBookmarkByFeedItemId(feedItemId)
} catch (e: Exception) {
throw RuntimeException("Failed to get bookmark by feed item ID", e)
}
}
suspend fun insertBookmark(bookmark: BookmarkEntity): Long {
return try {
bookmarkDao.insertBookmark(bookmark)
} catch (e: Exception) {
throw RuntimeException("Failed to insert bookmark", e)
}
}
suspend fun insertBookmarks(bookmarks: List<BookmarkEntity>): List<Long> {
return try {
bookmarkDao.insertBookmarks(bookmarks)
} catch (e: Exception) {
throw RuntimeException("Failed to insert bookmarks", e)
}
}
suspend fun updateBookmark(bookmark: BookmarkEntity): Int {
return try {
bookmarkDao.updateBookmark(bookmark)
} catch (e: Exception) {
throw RuntimeException("Failed to update bookmark", e)
}
}
suspend fun deleteBookmark(bookmark: BookmarkEntity): Int {
return try {
bookmarkDao.deleteBookmark(bookmark)
} catch (e: Exception) {
throw RuntimeException("Failed to delete bookmark", e)
}
}
suspend fun deleteBookmarkById(id: String): Int {
return try {
bookmarkDao.deleteBookmarkById(id)
} catch (e: Exception) {
throw RuntimeException("Failed to delete bookmark by ID", e)
}
}
suspend fun deleteBookmarkByFeedItemId(feedItemId: String): Int {
return try {
bookmarkDao.deleteBookmarkByFeedItemId(feedItemId)
} catch (e: Exception) {
throw RuntimeException("Failed to delete bookmark by feed item ID", e)
}
}
}

View File

@@ -1,102 +0,0 @@
package com.rssuper.repository
import com.rssuper.database.daos.FeedItemDao
import com.rssuper.database.entities.FeedItemEntity
import com.rssuper.model.Error
import com.rssuper.model.State
import com.rssuper.models.Feed
import com.rssuper.models.FeedItem
import com.rssuper.parsing.FeedParser
import com.rssuper.parsing.ParseResult
import com.rssuper.services.FeedFetcher
import kotlinx.coroutines.flow.Flow
import kotlinx.coroutines.flow.MutableStateFlow
import kotlinx.coroutines.flow.StateFlow
import kotlinx.coroutines.flow.asStateFlow
import kotlinx.coroutines.flow.combine
import java.util.Date
class FeedRepository(
private val feedFetcher: FeedFetcher,
private val feedItemDao: FeedItemDao
) {
private val _feedState = MutableStateFlow<State<Feed>>(State.Idle)
val feedState: StateFlow<State<Feed>> = _feedState.asStateFlow()
private val _feedItemsState = MutableStateFlow<State<List<FeedItemEntity>>>(State.Idle)
val feedItemsState: StateFlow<State<List<FeedItemEntity>>> = _feedItemsState.asStateFlow()
suspend fun fetchFeed(url: String, httpAuth: com.rssuper.services.HTTPAuthCredentials? = null): Boolean {
_feedState.value = State.Loading
val result = feedFetcher.fetchAndParse(url, httpAuth)
return result.fold(
onSuccess = { parseResult ->
// ParseResult is a plain data class; fetch/parse failures arrive via onFailure.
_feedState.value = State.Success(parseResult.feed)
true
},
onFailure = { error ->
_feedState.value = State.Error(
message = error.message ?: "Unknown error",
cause = error
)
false
}
)
}
fun getFeedItems(subscriptionId: String): Flow<State<List<FeedItemEntity>>> {
return feedItemDao.getItemsBySubscription(subscriptionId)
.map { items ->
State.Success(items)
}
}
suspend fun markItemAsRead(itemId: String): Boolean {
return try {
feedItemDao.markAsRead(itemId)
true
} catch (e: Exception) {
_feedItemsState.value = State.Error("Failed to mark item as read", e)
false
}
}
suspend fun markItemAsStarred(itemId: String): Boolean {
return try {
feedItemDao.markAsStarred(itemId)
true
} catch (e: Exception) {
_feedItemsState.value = State.Error("Failed to mark item as starred", e)
false
}
}
fun getStarredItems(): Flow<State<List<FeedItemEntity>>> {
return feedItemDao.getStarredItems()
.map { items ->
State.Success(items)
}
}
fun getUnreadItems(): Flow<State<List<FeedItemEntity>>> {
return feedItemDao.getUnreadItems()
.map { items ->
State.Success(items)
}
}
private fun <T> Flow<List<T>>.map(transform: (List<T>) -> State<List<T>>): Flow<State<List<T>>> {
// Build the flow directly: delegating to this.map here would resolve back to
// this same extension and recurse forever, since kotlinx's Flow.map is not imported.
return kotlinx.coroutines.flow.flow { collect { items -> emit(transform(items)) } }
}
}

View File

@@ -1,32 +0,0 @@
package com.rssuper.repository
import com.rssuper.database.entities.FeedItemEntity
import com.rssuper.database.entities.SubscriptionEntity
import com.rssuper.state.State
import kotlinx.coroutines.flow.Flow
interface FeedRepository {
fun getFeedItems(subscriptionId: String?): Flow<State<List<FeedItemEntity>>>
suspend fun getFeedItemById(id: String): FeedItemEntity?
suspend fun insertFeedItem(item: FeedItemEntity): Long
suspend fun insertFeedItems(items: List<FeedItemEntity>): List<Long>
suspend fun updateFeedItem(item: FeedItemEntity): Int
suspend fun markAsRead(id: String, isRead: Boolean): Int
suspend fun markAsStarred(id: String, isStarred: Boolean): Int
suspend fun deleteFeedItem(id: String): Int
suspend fun getUnreadCount(subscriptionId: String?): Int
}
interface SubscriptionRepository {
fun getAllSubscriptions(): Flow<List<SubscriptionEntity>>
fun getEnabledSubscriptions(): Flow<List<SubscriptionEntity>>
fun getSubscriptionsByCategory(category: String): Flow<List<SubscriptionEntity>>
suspend fun getSubscriptionById(id: String): SubscriptionEntity?
suspend fun getSubscriptionByUrl(url: String): SubscriptionEntity?
suspend fun insertSubscription(subscription: SubscriptionEntity): Long
suspend fun updateSubscription(subscription: SubscriptionEntity): Int
suspend fun deleteSubscription(id: String): Int
suspend fun setEnabled(id: String, enabled: Boolean): Int
suspend fun setError(id: String, error: String?): Int
suspend fun updateLastFetchedAt(id: String, lastFetchedAt: Long): Int
suspend fun updateNextFetchAt(id: String, nextFetchAt: Long): Int
}

View File

@@ -1,210 +0,0 @@
package com.rssuper.repository
import com.rssuper.database.daos.FeedItemDao
import com.rssuper.database.daos.SubscriptionDao
import com.rssuper.database.entities.FeedItemEntity
import com.rssuper.database.entities.SubscriptionEntity
import com.rssuper.state.ErrorDetails
import com.rssuper.state.ErrorType
import com.rssuper.state.State
import kotlinx.coroutines.flow.Flow
import kotlinx.coroutines.flow.catch
import kotlinx.coroutines.flow.first
import kotlinx.coroutines.flow.map
class FeedRepositoryImpl(
private val feedItemDao: FeedItemDao
) : FeedRepository {
override fun getFeedItems(subscriptionId: String?): Flow<State<List<FeedItemEntity>>> {
return if (subscriptionId != null) {
feedItemDao.getItemsBySubscription(subscriptionId).map { items ->
State.Success(items)
}.catch { e ->
emit(State.Error("Failed to load feed items", e))
}
} else {
feedItemDao.getUnreadItems().map { items ->
State.Success(items)
}.catch { e ->
emit(State.Error("Failed to load feed items", e))
}
}
}
override suspend fun getFeedItemById(id: String): FeedItemEntity? {
return try {
feedItemDao.getItemById(id)
} catch (e: Exception) {
throw ErrorDetails(ErrorType.DATABASE, "Failed to get feed item", false)
}
}
override suspend fun insertFeedItem(item: FeedItemEntity): Long {
return try {
feedItemDao.insertItem(item)
} catch (e: Exception) {
throw ErrorDetails(ErrorType.DATABASE, "Failed to insert feed item", false)
}
}
override suspend fun insertFeedItems(items: List<FeedItemEntity>): List<Long> {
return try {
feedItemDao.insertItems(items)
} catch (e: Exception) {
throw ErrorDetails(ErrorType.DATABASE, "Failed to insert feed items", false)
}
}
override suspend fun updateFeedItem(item: FeedItemEntity): Int {
return try {
feedItemDao.updateItem(item)
} catch (e: Exception) {
throw ErrorDetails(ErrorType.DATABASE, "Failed to update feed item", false)
}
}
override suspend fun markAsRead(id: String, isRead: Boolean): Int {
return try {
if (isRead) {
feedItemDao.markAsRead(id)
} else {
feedItemDao.markAsUnread(id)
}
} catch (e: Exception) {
throw ErrorDetails(ErrorType.DATABASE, "Failed to mark item as read", true)
}
}
override suspend fun markAsStarred(id: String, isStarred: Boolean): Int {
return try {
if (isStarred) {
feedItemDao.markAsStarred(id)
} else {
feedItemDao.markAsUnstarred(id)
}
} catch (e: Exception) {
throw ErrorDetails(ErrorType.DATABASE, "Failed to star item", true)
}
}
override suspend fun deleteFeedItem(id: String): Int {
return try {
feedItemDao.deleteItemById(id)
} catch (e: Exception) {
throw ErrorDetails(ErrorType.DATABASE, "Failed to delete feed item", false)
}
}
override suspend fun getUnreadCount(subscriptionId: String?): Int {
return try {
if (subscriptionId != null) {
feedItemDao.getUnreadCount(subscriptionId).first()
} else {
feedItemDao.getTotalUnreadCount().first()
}
} catch (e: Exception) {
throw ErrorDetails(ErrorType.DATABASE, "Failed to get unread count", false)
}
}
}
class SubscriptionRepositoryImpl(
private val subscriptionDao: SubscriptionDao
) : SubscriptionRepository {
override fun getAllSubscriptions(): Flow<State<List<SubscriptionEntity>>> {
return subscriptionDao.getAllSubscriptions().map { subscriptions ->
State.Success(subscriptions)
}.catch { e ->
emit(State.Error("Failed to load subscriptions", e))
}
}
override fun getEnabledSubscriptions(): Flow<State<List<SubscriptionEntity>>> {
return subscriptionDao.getEnabledSubscriptions().map { subscriptions ->
State.Success(subscriptions)
}.catch { e ->
emit(State.Error("Failed to load enabled subscriptions", e))
}
}
override fun getSubscriptionsByCategory(category: String): Flow<State<List<SubscriptionEntity>>> {
return subscriptionDao.getSubscriptionsByCategory(category).map { subscriptions ->
State.Success(subscriptions)
}.catch { e ->
emit(State.Error("Failed to load subscriptions by category", e))
}
}
override suspend fun getSubscriptionById(id: String): SubscriptionEntity? {
return try {
subscriptionDao.getSubscriptionById(id)
} catch (e: Exception) {
throw ErrorDetails(ErrorType.DATABASE, "Failed to get subscription", false)
}
}
override suspend fun getSubscriptionByUrl(url: String): SubscriptionEntity? {
return try {
subscriptionDao.getSubscriptionByUrl(url)
} catch (e: Exception) {
throw ErrorDetails(ErrorType.DATABASE, "Failed to get subscription by URL", false)
}
}
override suspend fun insertSubscription(subscription: SubscriptionEntity): Long {
return try {
subscriptionDao.insertSubscription(subscription)
} catch (e: Exception) {
throw ErrorDetails(ErrorType.DATABASE, "Failed to insert subscription", false)
}
}
override suspend fun updateSubscription(subscription: SubscriptionEntity): Int {
return try {
subscriptionDao.updateSubscription(subscription)
} catch (e: Exception) {
throw ErrorDetails(ErrorType.DATABASE, "Failed to update subscription", true)
}
}
override suspend fun deleteSubscription(id: String): Int {
return try {
subscriptionDao.deleteSubscriptionById(id)
} catch (e: Exception) {
throw ErrorDetails(ErrorType.DATABASE, "Failed to delete subscription", false)
}
}
override suspend fun setEnabled(id: String, enabled: Boolean): Int {
return try {
subscriptionDao.setEnabled(id, enabled)
} catch (e: Exception) {
throw ErrorDetails(ErrorType.DATABASE, "Failed to set subscription enabled state", true)
}
}
override suspend fun setError(id: String, error: String?): Int {
return try {
subscriptionDao.updateError(id, error)
} catch (e: Exception) {
throw ErrorDetails(ErrorType.DATABASE, "Failed to set subscription error", true)
}
}
override suspend fun updateLastFetchedAt(id: String, lastFetchedAt: Long): Int {
return try {
subscriptionDao.updateLastFetchedAtMillis(id, lastFetchedAt)
} catch (e: Exception) {
throw ErrorDetails(ErrorType.DATABASE, "Failed to update last fetched time", true)
}
}
override suspend fun updateNextFetchAt(id: String, nextFetchAt: Long): Int {
return try {
subscriptionDao.updateNextFetchAtMillis(id, nextFetchAt)
} catch (e: Exception) {
throw ErrorDetails(ErrorType.DATABASE, "Failed to update next fetch time", true)
}
}
}

View File

@@ -1,156 +0,0 @@
package com.rssuper.repository
import com.rssuper.database.daos.SubscriptionDao
import com.rssuper.database.entities.SubscriptionEntity
import com.rssuper.model.State
import com.rssuper.models.FeedSubscription
import kotlinx.coroutines.flow.Flow
import kotlinx.coroutines.flow.MutableStateFlow
import kotlinx.coroutines.flow.StateFlow
import kotlinx.coroutines.flow.asStateFlow
import java.util.Date
class SubscriptionRepository(
private val subscriptionDao: SubscriptionDao
) {
private val _subscriptionsState = MutableStateFlow<State<List<SubscriptionEntity>>>(State.Idle)
val subscriptionsState: StateFlow<State<List<SubscriptionEntity>>> = _subscriptionsState.asStateFlow()
fun getAllSubscriptions(): Flow<State<List<SubscriptionEntity>>> {
return subscriptionDao.getAllSubscriptions()
.map { subscriptions ->
State.Success(subscriptions)
}
}
fun getEnabledSubscriptions(): Flow<State<List<SubscriptionEntity>>> {
return subscriptionDao.getEnabledSubscriptions()
.map { subscriptions ->
State.Success(subscriptions)
}
}
fun getSubscriptionsByCategory(category: String): Flow<State<List<SubscriptionEntity>>> {
return subscriptionDao.getSubscriptionsByCategory(category)
.map { subscriptions ->
State.Success(subscriptions)
}
}
suspend fun getSubscriptionById(id: String): State<SubscriptionEntity?> {
return try {
val subscription = subscriptionDao.getSubscriptionById(id)
State.Success(subscription)
} catch (e: Exception) {
State.Error("Failed to get subscription", e)
}
}
suspend fun getSubscriptionByUrl(url: String): State<SubscriptionEntity?> {
return try {
val subscription = subscriptionDao.getSubscriptionByUrl(url)
State.Success(subscription)
} catch (e: Exception) {
State.Error("Failed to get subscription by URL", e)
}
}
suspend fun addSubscription(subscription: FeedSubscription): Boolean {
return try {
subscriptionDao.insertSubscription(
SubscriptionEntity(
id = subscription.id,
url = subscription.url,
title = subscription.title,
category = subscription.category,
enabled = subscription.enabled,
fetchInterval = subscription.fetchInterval,
createdAt = subscription.createdAt,
updatedAt = subscription.updatedAt,
lastFetchedAt = subscription.lastFetchedAt,
nextFetchAt = subscription.nextFetchAt,
error = subscription.error,
httpAuthUsername = subscription.httpAuth?.username,
httpAuthPassword = subscription.httpAuth?.password
)
)
_subscriptionsState.value = State.Success(emptyList())
true
} catch (e: Exception) {
_subscriptionsState.value = State.Error("Failed to add subscription", e)
false
}
}
suspend fun updateSubscription(subscription: FeedSubscription): Boolean {
return try {
subscriptionDao.updateSubscription(
SubscriptionEntity(
id = subscription.id,
url = subscription.url,
title = subscription.title,
category = subscription.category,
enabled = subscription.enabled,
fetchInterval = subscription.fetchInterval,
createdAt = subscription.createdAt,
updatedAt = subscription.updatedAt,
lastFetchedAt = subscription.lastFetchedAt,
nextFetchAt = subscription.nextFetchAt,
error = subscription.error,
httpAuthUsername = subscription.httpAuth?.username,
httpAuthPassword = subscription.httpAuth?.password
)
)
true
} catch (e: Exception) {
_subscriptionsState.value = State.Error("Failed to update subscription", e)
false
}
}
suspend fun deleteSubscription(id: String): Boolean {
return try {
subscriptionDao.deleteSubscriptionById(id)
true
} catch (e: Exception) {
_subscriptionsState.value = State.Error("Failed to delete subscription", e)
false
}
}
suspend fun updateError(id: String, error: String?): Boolean {
return try {
subscriptionDao.updateError(id, error)
true
} catch (e: Exception) {
_subscriptionsState.value = State.Error("Failed to update subscription error", e)
false
}
}
suspend fun updateLastFetchedAt(id: String, lastFetchedAt: Date): Boolean {
return try {
subscriptionDao.updateLastFetchedAt(id, lastFetchedAt)
true
} catch (e: Exception) {
_subscriptionsState.value = State.Error("Failed to update last fetched at", e)
false
}
}
suspend fun updateNextFetchAt(id: String, nextFetchAt: Date): Boolean {
return try {
subscriptionDao.updateNextFetchAt(id, nextFetchAt)
true
} catch (e: Exception) {
_subscriptionsState.value = State.Error("Failed to update next fetch at", e)
false
}
}
// Bridge to State-wrapped values. Collecting the upstream flow directly avoids
// the extension resolving to itself (the original body called .map, which this
// declaration shadows, causing infinite recursion).
private fun <T> Flow<List<T>>.map(transform: (List<T>) -> State<List<T>>): Flow<State<List<T>>> {
return kotlinx.coroutines.flow.flow { this@map.collect { emit(transform(it)) } }
}
}

View File

@@ -1,18 +0,0 @@
package com.rssuper.search
import com.rssuper.models.SearchFilters
/**
* SearchQuery - Represents a search query with filters
*/
data class SearchQuery(
val queryString: String,
val filters: SearchFilters? = null,
val page: Int = 1,
val pageSize: Int = 20,
val timestamp: Long = System.currentTimeMillis()
) {
fun isValid(): Boolean = queryString.isNotEmpty()
fun getCacheKey(): String = "${queryString}_${filters?.hashCode() ?: 0}"
}

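The cache-key scheme used by the deleted `SearchQuery` (query string plus filter hash) can be sketched without the Room models; `Filters` below is a hypothetical stand-in for `SearchFilters`:

```kotlin
// Minimal sketch of the SearchQuery cache-key scheme.
// Filters is a hypothetical stand-in for the real SearchFilters model.
data class Filters(val category: String? = null, val unreadOnly: Boolean = false)

data class Query(val queryString: String, val filters: Filters? = null) {
    fun isValid(): Boolean = queryString.isNotEmpty()
    // Equal query+filters pairs yield equal keys (data classes share hashCode),
    // so results can be cached per combination; null filters hash to 0.
    fun cacheKey(): String = "${queryString}_${filters?.hashCode() ?: 0}"
}

fun main() {
    val a = Query("kotlin", Filters(category = "Tech"))
    val b = Query("kotlin", Filters(category = "Tech"))
    println(a.cacheKey() == b.cacheKey())
    println(Query("x").cacheKey()) // prints "x_0"
}
```

Because `Filters` is a data class, structurally equal filter objects produce the same key, which is what makes hash-based cache keys safe here.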
View File

@@ -1,16 +0,0 @@
package com.rssuper.search
import com.rssuper.database.entities.FeedItemEntity
/**
* SearchResult - Represents a search result with relevance score
*/
data class SearchResult(
val feedItem: FeedItemEntity,
val relevanceScore: Float,
val highlight: String? = null
) {
fun isHighRelevance(): Boolean = relevanceScore > 0.8f
fun isMediumRelevance(): Boolean = relevanceScore in 0.5f..0.8f
fun isLowRelevance(): Boolean = relevanceScore < 0.5f
}

View File

@@ -1,71 +0,0 @@
package com.rssuper.search
import com.rssuper.database.daos.FeedItemDao
import com.rssuper.database.entities.FeedItemEntity
/**
* SearchResultProvider - Provides search results from the database
*/
class SearchResultProvider(
private val feedItemDao: FeedItemDao
) {
suspend fun search(query: String, limit: Int = 20): List<SearchResult> {
// Use FTS query to search feed items
val results = feedItemDao.searchByFts(query, limit)
return results.mapIndexed { index, item ->
SearchResult(
feedItem = item,
relevanceScore = calculateRelevance(query, item, index),
highlight = generateHighlight(item)
)
}
}
suspend fun searchBySubscription(query: String, subscriptionId: String, limit: Int = 20): List<SearchResult> {
val results = feedItemDao.searchByFts(query, limit)
return results.filter { it.subscriptionId == subscriptionId }.mapIndexed { index, item ->
SearchResult(
feedItem = item,
relevanceScore = calculateRelevance(query, item, index),
highlight = generateHighlight(item)
)
}
}
private fun calculateRelevance(query: String, item: FeedItemEntity, position: Int): Float {
val queryLower = query.lowercase()
var score = 0.0f
// Title match (highest weight)
if (item.title.lowercase().contains(queryLower)) {
score += 1.0f
}
// Author match
if (item.author?.lowercase()?.contains(queryLower) == true) {
score += 0.5f
}
// Position bonus (earlier results are more relevant)
score += (1.0f / (position + 1)) * 0.3f
return score.coerceIn(0.0f, 1.0f)
}
private fun generateHighlight(item: FeedItemEntity): String? {
val maxLength = 200
var text = item.title
if (item.description?.isNotEmpty() == true) {
text += " ${item.description}"
}
if (text.length > maxLength) {
text = text.substring(0, maxLength) + "..."
}
return text
}
}

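The relevance heuristic in `SearchResultProvider` (title match worth 1.0, author match 0.5, plus a position bonus, clamped to [0, 1]) can be exercised as a standalone function; this sketch takes plain strings instead of a `FeedItemEntity`:

```kotlin
// Standalone sketch of SearchResultProvider's relevance heuristic,
// taking plain fields instead of a FeedItemEntity.
fun relevance(query: String, title: String, author: String?, position: Int): Float {
    val q = query.lowercase()
    var score = 0.0f
    if (title.lowercase().contains(q)) score += 1.0f          // title match: highest weight
    if (author?.lowercase()?.contains(q) == true) score += 0.5f // author match
    score += (1.0f / (position + 1)) * 0.3f                   // earlier results rank higher
    return score.coerceIn(0.0f, 1.0f)
}

fun main() {
    println(relevance("kotlin", "Kotlin 2.0 released", null, 0)) // clamped to 1.0
    println(relevance("rust", "Kotlin weekly", null, 3))         // position bonus only
}
```

Note the clamp means a first-position title match and a tenth-position title match can both report 1.0; the position bonus only differentiates items below the cap.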
View File

@@ -1,81 +0,0 @@
package com.rssuper.search
import com.rssuper.database.daos.FeedItemDao
import com.rssuper.database.daos.SearchHistoryDao
import com.rssuper.database.entities.SearchHistoryEntity
import kotlinx.coroutines.flow.Flow
import kotlinx.coroutines.flow.firstOrNull
import kotlinx.coroutines.flow.flow
/**
* SearchService - Provides search functionality with FTS
*/
class SearchService(
private val feedItemDao: FeedItemDao,
private val searchHistoryDao: SearchHistoryDao,
private val resultProvider: SearchResultProvider
) {
private val cache = mutableMapOf<String, List<SearchResult>>()
private val maxCacheSize = 100
fun search(query: String): Flow<List<SearchResult>> {
val cacheKey = query.hashCode().toString()
// Return cached results if available
cache[cacheKey]?.let { return flow { emit(it) } }
return flow {
val results = resultProvider.search(query)
cache[cacheKey] = results
if (cache.size > maxCacheSize) {
cache.remove(cache.keys.first())
}
emit(results)
}
}
fun searchBySubscription(query: String, subscriptionId: String): Flow<List<SearchResult>> {
return flow {
val results = resultProvider.searchBySubscription(query, subscriptionId)
emit(results)
}
}
suspend fun searchAndSave(query: String): List<SearchResult> {
val results = resultProvider.search(query)
// Save to search history
saveSearchHistory(query)
return results
}
suspend fun saveSearchHistory(query: String) {
val searchHistory = SearchHistoryEntity(
id = System.currentTimeMillis().toString(),
query = query,
filtersJson = null,
timestamp = System.currentTimeMillis()
)
searchHistoryDao.insertSearchHistory(searchHistory)
}
fun getSearchHistory(): Flow<List<SearchHistoryEntity>> {
return searchHistoryDao.getAllSearchHistory()
}
suspend fun getRecentSearches(limit: Int = 10): List<SearchHistoryEntity> {
return searchHistoryDao.getRecentSearches(limit).firstOrNull() ?: emptyList()
}
suspend fun clearSearchHistory() {
searchHistoryDao.deleteAllSearchHistory()
}
fun getSearchSuggestions(query: String): Flow<List<SearchHistoryEntity>> {
return searchHistoryDao.searchHistory(query)
}
fun clearCache() {
cache.clear()
}
}

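The eviction policy in `SearchService` relies on `mutableMapOf` returning a `LinkedHashMap`, so `remove(keys.first())` drops the oldest insertion. A minimal sketch of that bounded cache:

```kotlin
// Sketch of SearchService's cache policy: mutableMapOf returns a LinkedHashMap,
// which preserves insertion order, so keys.first() is always the oldest entry.
class BoundedCache<K, V>(private val maxSize: Int) {
    private val map = mutableMapOf<K, V>()
    val size get() = map.size
    operator fun get(key: K): V? = map[key]
    operator fun set(key: K, value: V) {
        map[key] = value
        if (map.size > maxSize) map.remove(map.keys.first()) // evict oldest
    }
}

fun main() {
    val cache = BoundedCache<String, Int>(2)
    cache["a"] = 1
    cache["b"] = 2
    cache["c"] = 3      // exceeds capacity: "a" is evicted
    println(cache["a"]) // prints null
    println(cache["c"]) // prints 3
}
```

This is FIFO eviction, not true LRU: re-reading or re-writing an existing key does not move it to the back of the iteration order, so a frequently used entry can still be evicted first.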
View File

@@ -1,174 +0,0 @@
package com.rssuper.services
import com.rssuper.parsing.FeedParser
import com.rssuper.parsing.ParseResult
import okhttp3.Call
import okhttp3.EventListener
import okhttp3.OkHttpClient
import okhttp3.Request
import okhttp3.Response
import java.io.IOException
import java.util.concurrent.TimeUnit
class FeedFetcher(
private val timeoutMs: Long = 15000,
private val maxRetries: Int = 3,
private val baseRetryDelayMs: Long = 1000
) {
private val client: OkHttpClient
init {
val builder = OkHttpClient.Builder()
.connectTimeout(timeoutMs, TimeUnit.MILLISECONDS)
.readTimeout(timeoutMs, TimeUnit.MILLISECONDS)
.writeTimeout(timeoutMs, TimeUnit.MILLISECONDS)
builder.eventListenerFactory { call -> TimeoutEventListener(call) }
client = builder.build()
}
fun fetch(
url: String,
httpAuth: HTTPAuthCredentials? = null,
ifNoneMatch: String? = null,
ifModifiedSince: String? = null
): NetworkResult<FetchResult> {
var lastError: Throwable? = null
for (attempt in 1..maxRetries) {
val result = fetchSingleAttempt(url, httpAuth, ifNoneMatch, ifModifiedSince)
when (result) {
is NetworkResult.Success -> return result
is NetworkResult.Failure -> {
lastError = result.error
if (attempt < maxRetries) {
val delay = calculateBackoffDelay(attempt)
Thread.sleep(delay)
}
}
}
}
return NetworkResult.Failure(lastError ?: NetworkError.Unknown())
}
fun fetchAndParse(url: String, httpAuth: HTTPAuthCredentials? = null): NetworkResult<ParseResult> {
val fetchResult = fetch(url, httpAuth)
return fetchResult.flatMap { result ->
try {
val parseResult = FeedParser.parse(result.feedXml, url)
NetworkResult.Success(parseResult)
} catch (e: Exception) {
NetworkResult.Failure(NetworkError.Unknown(e))
}
}
}
private fun fetchSingleAttempt(
url: String,
httpAuth: HTTPAuthCredentials? = null,
ifNoneMatch: String? = null,
ifModifiedSince: String? = null
): NetworkResult<FetchResult> {
val requestBuilder = Request.Builder()
.url(url)
.addHeader("User-Agent", "RSSuper/1.0")
ifNoneMatch?.let { requestBuilder.addHeader("If-None-Match", it) }
ifModifiedSince?.let { requestBuilder.addHeader("If-Modified-Since", it) }
httpAuth?.let {
requestBuilder.addHeader("Authorization", it.toCredentials())
}
val request = requestBuilder.build()
return try {
val response = client.newCall(request).execute()
handleResponse(response, url)
} catch (e: IOException) {
NetworkResult.Failure(NetworkError.Unknown(e))
} catch (e: Exception) {
NetworkResult.Failure(NetworkError.Unknown(e))
}
}
private fun handleResponse(response: Response, url: String): NetworkResult<FetchResult> {
try {
val body = response.body
return when (response.code) {
200 -> {
if (body != null) {
NetworkResult.Success(FetchResult.fromResponse(response, url, response.cacheResponse != null))
} else {
NetworkResult.Failure(NetworkError.Http(response.code, "Empty response body"))
}
}
304 -> {
if (body != null) {
NetworkResult.Success(FetchResult.fromResponse(response, url, true))
} else {
NetworkResult.Failure(NetworkError.Http(response.code, "Empty response body"))
}
}
in 400..499 -> {
NetworkResult.Failure(NetworkError.Http(response.code, "Client error: ${response.message}"))
}
in 500..599 -> {
NetworkResult.Failure(NetworkError.Http(response.code, "Server error: ${response.message}"))
}
else -> {
NetworkResult.Failure(NetworkError.Http(response.code, "Unexpected status code: ${response.code}"))
}
}
} finally {
response.close()
}
}
private fun calculateBackoffDelay(attempt: Int): Long {
var delay = baseRetryDelayMs
for (i in 1 until attempt) {
delay *= 2
}
return delay
}
private class TimeoutEventListener(private val call: Call) : EventListener() {
override fun callStart(call: Call) {
}
override fun callEnd(call: Call) {
}
override fun callFailed(call: Call, ioe: IOException) {
}
}
}
// Top-level so callers can name NetworkResult without qualifying it as FeedFetcher.NetworkResult.
sealed class NetworkResult<out T> {
data class Success<T>(val value: T) : NetworkResult<T>()
data class Failure<T>(val error: Throwable) : NetworkResult<T>()
fun isSuccess(): Boolean = this is Success
fun isFailure(): Boolean = this is Failure
fun getOrNull(): T? = when (this) {
is Success -> value
is Failure -> null
}
fun <R> map(transform: (T) -> R): NetworkResult<R> = when (this) {
is Success -> Success(transform(value))
is Failure -> Failure(error)
}
fun <R> flatMap(transform: (T) -> NetworkResult<R>): NetworkResult<R> = when (this) {
is Success -> transform(value)
is Failure -> Failure(error)
}
}

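`FeedFetcher.calculateBackoffDelay` doubles the base delay once per prior attempt, i.e. `base * 2^(attempt-1)`: 1 s, 2 s, 4 s for attempts 1..3 with the default 1000 ms base. The same arithmetic as a standalone function:

```kotlin
// Sketch of FeedFetcher's retry schedule: exponential backoff,
// delay = baseMs * 2^(attempt - 1).
fun backoffDelay(attempt: Int, baseMs: Long = 1000): Long {
    var delay = baseMs
    repeat(attempt - 1) { delay *= 2 }
    return delay
}

fun main() {
    println((1..3).map { backoffDelay(it) }) // prints [1000, 2000, 4000]
}
```

With `maxRetries = 3` the fetcher sleeps at most after attempts 1 and 2, so the worst-case added wait before the final attempt is 1 s + 2 s = 3 s.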
View File

@@ -1,31 +0,0 @@
package com.rssuper.services
import okhttp3.CacheControl
import okhttp3.Response
data class FetchResult(
val feedXml: String,
val url: String,
val cacheControl: CacheControl?,
val isCached: Boolean,
val etag: String? = null,
val lastModified: String? = null
) {
companion object {
fun fromResponse(response: Response, url: String, isCached: Boolean = false): FetchResult {
val body = response.body?.string() ?: ""
val cacheControl = response.cacheControl
val etag = response.header("ETag")
val lastModified = response.header("Last-Modified")
return FetchResult(
feedXml = body,
url = url,
cacheControl = cacheControl,
isCached = isCached,
etag = etag,
lastModified = lastModified
)
}
}
}

View File

@@ -1,12 +0,0 @@
package com.rssuper.services
import okhttp3.Credentials
data class HTTPAuthCredentials(
val username: String,
val password: String
) {
fun toCredentials(): String {
return Credentials.basic(username, password)
}
}

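`Credentials.basic` from OkHttp produces a standard HTTP Basic header value: the literal `Basic ` followed by the Base64 of `username:password` (ISO-8859-1 by default in OkHttp). A dependency-free sketch of the same encoding:

```kotlin
import java.util.Base64

// What okhttp3.Credentials.basic produces: "Basic " + Base64("username:password"),
// using ISO-8859-1 as OkHttp does by default.
fun basicAuth(username: String, password: String): String {
    val raw = "$username:$password".toByteArray(Charsets.ISO_8859_1)
    return "Basic " + Base64.getEncoder().encodeToString(raw)
}

fun main() {
    println(basicAuth("user", "pass")) // prints "Basic dXNlcjpwYXNz"
}
```

Since the colon is the field separator, a username containing `:` cannot be represented unambiguously in Basic auth; that limitation comes from RFC 7617, not from this implementation.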
View File

@@ -1,7 +0,0 @@
package com.rssuper.services
sealed class NetworkError(message: String? = null, cause: Throwable? = null) : Exception(message, cause) {
data class Http(val statusCode: Int, override val message: String) : NetworkError(message)
data class Timeout(val durationMs: Long) : NetworkError("Timeout")
data class Unknown(override val cause: Throwable? = null) : NetworkError(cause = cause)
}

View File

@@ -1,10 +0,0 @@
package com.rssuper.state
import com.rssuper.database.entities.BookmarkEntity
sealed interface BookmarkState {
data object Idle : BookmarkState
data object Loading : BookmarkState
data class Success(val data: List<BookmarkEntity>) : BookmarkState
data class Error(val message: String, val cause: Throwable? = null) : BookmarkState
}

View File

@@ -1,15 +0,0 @@
package com.rssuper.state
enum class ErrorType {
NETWORK,
DATABASE,
PARSING,
AUTH,
UNKNOWN
}
data class ErrorDetails(
val type: ErrorType,
val message: String,
val retryable: Boolean = false
)

View File

@@ -1,8 +0,0 @@
package com.rssuper.state
sealed interface State<out T> {
data object Idle : State<Nothing>
data object Loading : State<Nothing>
data class Success<T>(val data: T) : State<T>
data class Error(val message: String, val cause: Throwable? = null) : State<Nothing>
}

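Because `State` is a sealed interface, a `when` over it is exhaustive without an `else` branch; the compiler rejects any handler that misses a case. A self-contained sketch mirroring the hierarchy above:

```kotlin
// Mirror of the app's sealed State hierarchy; `describe` shows the exhaustive
// `when` the sealed modifier enables (no `else` branch needed).
sealed interface State<out T> {
    data object Idle : State<Nothing>
    data object Loading : State<Nothing>
    data class Success<T>(val data: T) : State<T>
    data class Error(val message: String, val cause: Throwable? = null) : State<Nothing>
}

fun <T> describe(state: State<T>): String = when (state) {
    State.Idle -> "idle"
    State.Loading -> "loading"
    is State.Success -> "success(${state.data})"
    is State.Error -> "error: ${state.message}"
}

fun main() {
    println(describe(State.Success(42)))   // prints "success(42)"
    println(describe(State.Error("boom"))) // prints "error: boom"
}
```

Adding a new subtype to `State` later would turn every such `when` into a compile error until the new case is handled, which is the main payoff of the pattern.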
View File

@@ -1,67 +0,0 @@
package com.rssuper.viewmodel
import androidx.lifecycle.ViewModel
import androidx.lifecycle.viewModelScope
import com.rssuper.database.entities.FeedItemEntity
import com.rssuper.repository.FeedRepository
import com.rssuper.state.State
import kotlinx.coroutines.flow.MutableStateFlow
import kotlinx.coroutines.flow.StateFlow
import kotlinx.coroutines.flow.asStateFlow
import kotlinx.coroutines.launch
class FeedViewModel(
private val feedRepository: FeedRepository
) : ViewModel() {
private val _feedState = MutableStateFlow<State<List<FeedItemEntity>>>(State.Idle)
val feedState: StateFlow<State<List<FeedItemEntity>>> = _feedState.asStateFlow()
private val _unreadCount = MutableStateFlow<State<Int>>(State.Idle)
val unreadCount: StateFlow<State<Int>> = _unreadCount.asStateFlow()
fun loadFeedItems(subscriptionId: String? = null) {
viewModelScope.launch {
feedRepository.getFeedItems(subscriptionId).collect { state ->
_feedState.value = state
}
}
}
fun loadUnreadCount(subscriptionId: String? = null) {
viewModelScope.launch {
_unreadCount.value = State.Loading
try {
val count = feedRepository.getUnreadCount(subscriptionId)
_unreadCount.value = State.Success(count)
} catch (e: Exception) {
_unreadCount.value = State.Error("Failed to load unread count", e)
}
}
}
fun markAsRead(id: String, isRead: Boolean) {
viewModelScope.launch {
try {
feedRepository.markAsRead(id, isRead)
loadUnreadCount()
} catch (e: Exception) {
_unreadCount.value = State.Error("Failed to update read state", e)
}
}
}
fun markAsStarred(id: String, isStarred: Boolean) {
viewModelScope.launch {
try {
feedRepository.markAsStarred(id, isStarred)
} catch (e: Exception) {
_feedState.value = State.Error("Failed to update starred state", e)
}
}
}
fun refreshFeed(subscriptionId: String? = null) {
loadFeedItems(subscriptionId)
loadUnreadCount(subscriptionId)
}
}

View File

@@ -1,83 +0,0 @@
package com.rssuper.viewmodel
import androidx.lifecycle.ViewModel
import androidx.lifecycle.viewModelScope
import com.rssuper.database.entities.SubscriptionEntity
import com.rssuper.repository.SubscriptionRepository
import com.rssuper.state.State
import kotlinx.coroutines.flow.MutableStateFlow
import kotlinx.coroutines.flow.StateFlow
import kotlinx.coroutines.flow.asStateFlow
import kotlinx.coroutines.launch
class SubscriptionViewModel(
private val subscriptionRepository: SubscriptionRepository
) : ViewModel() {
private val _subscriptionsState = MutableStateFlow<State<List<SubscriptionEntity>>>(State.Idle)
val subscriptionsState: StateFlow<State<List<SubscriptionEntity>>> = _subscriptionsState.asStateFlow()
private val _enabledSubscriptionsState = MutableStateFlow<State<List<SubscriptionEntity>>>(State.Idle)
val enabledSubscriptionsState: StateFlow<State<List<SubscriptionEntity>>> = _enabledSubscriptionsState.asStateFlow()
fun loadAllSubscriptions() {
viewModelScope.launch {
subscriptionRepository.getAllSubscriptions().collect { state ->
_subscriptionsState.value = state
}
}
}
fun loadEnabledSubscriptions() {
viewModelScope.launch {
subscriptionRepository.getEnabledSubscriptions().collect { state ->
_enabledSubscriptionsState.value = state
}
}
}
fun setEnabled(id: String, enabled: Boolean) {
viewModelScope.launch {
try {
subscriptionRepository.setEnabled(id, enabled)
loadEnabledSubscriptions()
} catch (e: Exception) {
_enabledSubscriptionsState.value = State.Error("Failed to update subscription enabled state", e)
}
}
}
fun setError(id: String, error: String?) {
viewModelScope.launch {
try {
subscriptionRepository.setError(id, error)
} catch (e: Exception) {
_subscriptionsState.value = State.Error("Failed to set subscription error", e)
}
}
}
fun updateLastFetchedAt(id: String, lastFetchedAt: Long) {
viewModelScope.launch {
try {
subscriptionRepository.updateLastFetchedAt(id, lastFetchedAt)
} catch (e: Exception) {
_subscriptionsState.value = State.Error("Failed to update last fetched time", e)
}
}
}
fun updateNextFetchAt(id: String, nextFetchAt: Long) {
viewModelScope.launch {
try {
subscriptionRepository.updateNextFetchAt(id, nextFetchAt)
} catch (e: Exception) {
_subscriptionsState.value = State.Error("Failed to update next fetch time", e)
}
}
}
fun refreshSubscriptions() {
loadAllSubscriptions()
loadEnabledSubscriptions()
}
}

View File

@@ -1,294 +0,0 @@
package com.rssuper.database
import android.content.Context
import androidx.room.Room
import androidx.test.core.app.ApplicationProvider
import com.rssuper.database.daos.FeedItemDao
import com.rssuper.database.entities.FeedItemEntity
import kotlinx.coroutines.flow.first
import kotlinx.coroutines.test.runTest
import org.junit.After
import org.junit.Assert.assertEquals
import org.junit.Assert.assertNotNull
import org.junit.Assert.assertNull
import org.junit.Before
import org.junit.Test
import java.util.Date
class FeedItemDaoTest {
private lateinit var database: RssDatabase
private lateinit var dao: FeedItemDao
@Before
fun createDb() {
val context = ApplicationProvider.getApplicationContext<Context>()
database = Room.inMemoryDatabaseBuilder(
context,
RssDatabase::class.java
)
.allowMainThreadQueries()
.build()
dao = database.feedItemDao()
}
@After
fun closeDb() {
database.close()
}
@Test
fun insertAndGetItem() = runTest {
val item = createTestItem("1", "sub1")
dao.insertItem(item)
val result = dao.getItemById("1")
assertNotNull(result)
assertEquals("1", result?.id)
assertEquals("Test Item", result?.title)
}
@Test
fun getItemsBySubscription() = runTest {
val item1 = createTestItem("1", "sub1")
val item2 = createTestItem("2", "sub1")
val item3 = createTestItem("3", "sub2")
dao.insertItems(listOf(item1, item2, item3))
val result = dao.getItemsBySubscription("sub1").first()
assertEquals(2, result.size)
}
@Test
fun getItemsBySubscriptions() = runTest {
val item1 = createTestItem("1", "sub1")
val item2 = createTestItem("2", "sub2")
val item3 = createTestItem("3", "sub3")
dao.insertItems(listOf(item1, item2, item3))
val result = dao.getItemsBySubscriptions(listOf("sub1", "sub2")).first()
assertEquals(2, result.size)
}
@Test
fun getUnreadItems() = runTest {
val unread = createTestItem("1", "sub1", isRead = false)
val read = createTestItem("2", "sub1", isRead = true)
dao.insertItems(listOf(unread, read))
val result = dao.getUnreadItems().first()
assertEquals(1, result.size)
assertEquals("1", result[0].id)
}
@Test
fun getStarredItems() = runTest {
val starred = createTestItem("1", "sub1", isStarred = true)
val notStarred = createTestItem("2", "sub1", isStarred = false)
dao.insertItems(listOf(starred, notStarred))
val result = dao.getStarredItems().first()
assertEquals(1, result.size)
assertEquals("1", result[0].id)
}
@Test
fun getItemsAfterDate() = runTest {
val oldDate = Date(System.currentTimeMillis() - 86400000 * 2)
val newDate = Date(System.currentTimeMillis() - 86400000)
val today = Date()
val oldItem = createTestItem("1", "sub1", published = oldDate)
val newItem = createTestItem("2", "sub1", published = newDate)
val todayItem = createTestItem("3", "sub1", published = today)
dao.insertItems(listOf(oldItem, newItem, todayItem))
val result = dao.getItemsAfterDate(newDate).first()
assertEquals(1, result.size)
assertEquals("3", result[0].id)
}
@Test
fun getUnreadCount() = runTest {
val unread1 = createTestItem("1", "sub1", isRead = false)
val unread2 = createTestItem("2", "sub1", isRead = false)
val read = createTestItem("3", "sub1", isRead = true)
dao.insertItems(listOf(unread1, unread2, read))
val count = dao.getUnreadCount("sub1").first()
assertEquals(2, count)
}
@Test
fun getTotalUnreadCount() = runTest {
val unread1 = createTestItem("1", "sub1", isRead = false)
val unread2 = createTestItem("2", "sub2", isRead = false)
val read = createTestItem("3", "sub1", isRead = true)
dao.insertItems(listOf(unread1, unread2, read))
val count = dao.getTotalUnreadCount().first()
assertEquals(2, count)
}
@Test
fun updateItem() = runTest {
val item = createTestItem("1", "sub1")
dao.insertItem(item)
val updated = item.copy(title = "Updated Title")
dao.updateItem(updated)
val result = dao.getItemById("1")
assertEquals("Updated Title", result?.title)
}
@Test
fun deleteItem() = runTest {
val item = createTestItem("1", "sub1")
dao.insertItem(item)
dao.deleteItem(item)
val result = dao.getItemById("1")
assertNull(result)
}
@Test
fun deleteItemById() = runTest {
val item = createTestItem("1", "sub1")
dao.insertItem(item)
dao.deleteItemById("1")
val result = dao.getItemById("1")
assertNull(result)
}
@Test
fun deleteItemsBySubscription() = runTest {
val item1 = createTestItem("1", "sub1")
val item2 = createTestItem("2", "sub1")
val item3 = createTestItem("3", "sub2")
dao.insertItems(listOf(item1, item2, item3))
dao.deleteItemsBySubscription("sub1")
val sub1Items = dao.getItemsBySubscription("sub1").first()
val sub2Items = dao.getItemsBySubscription("sub2").first()
assertEquals(0, sub1Items.size)
assertEquals(1, sub2Items.size)
}
@Test
fun markAsRead() = runTest {
val item = createTestItem("1", "sub1", isRead = false)
dao.insertItem(item)
dao.markAsRead("1")
val result = dao.getItemById("1")
assertEquals(true, result?.isRead)
}
@Test
fun markAsUnread() = runTest {
val item = createTestItem("1", "sub1", isRead = true)
dao.insertItem(item)
dao.markAsUnread("1")
val result = dao.getItemById("1")
assertEquals(false, result?.isRead)
}
@Test
fun markAsStarred() = runTest {
val item = createTestItem("1", "sub1", isStarred = false)
dao.insertItem(item)
dao.markAsStarred("1")
val result = dao.getItemById("1")
assertEquals(true, result?.isStarred)
}
@Test
fun markAsUnstarred() = runTest {
val item = createTestItem("1", "sub1", isStarred = true)
dao.insertItem(item)
dao.markAsUnstarred("1")
val result = dao.getItemById("1")
assertEquals(false, result?.isStarred)
}
@Test
fun markAllAsRead() = runTest {
val item1 = createTestItem("1", "sub1", isRead = false)
val item2 = createTestItem("2", "sub1", isRead = false)
val item3 = createTestItem("3", "sub2", isRead = false)
dao.insertItems(listOf(item1, item2, item3))
dao.markAllAsRead("sub1")
val sub1Items = dao.getItemsBySubscription("sub1").first()
val sub2Items = dao.getItemsBySubscription("sub2").first()
assertEquals(true, sub1Items[0].isRead)
assertEquals(true, sub1Items[1].isRead)
assertEquals(false, sub2Items[0].isRead)
}
@Test
fun getItemsPaginated() = runTest {
for (i in 1..10) {
val item = createTestItem(i.toString(), "sub1")
dao.insertItem(item)
}
val firstPage = dao.getItemsPaginated("sub1", 5, 0)
val secondPage = dao.getItemsPaginated("sub1", 5, 5)
assertEquals(5, firstPage.size)
assertEquals(5, secondPage.size)
}
private fun createTestItem(
id: String,
subscriptionId: String,
title: String = "Test Item",
isRead: Boolean = false,
isStarred: Boolean = false,
published: Date = Date()
): FeedItemEntity {
return FeedItemEntity(
id = id,
subscriptionId = subscriptionId,
title = title,
link = "https://example.com/$id",
description = "Test description",
content = "Test content",
author = "Test Author",
published = published,
updated = published,
categories = "Tech,News",
enclosureUrl = null,
enclosureType = null,
enclosureLength = null,
guid = "guid-$id",
isRead = isRead,
isStarred = isStarred
)
}
}

@@ -1,196 +0,0 @@
package com.rssuper.database
import android.content.Context
import androidx.room.Room
import androidx.test.core.app.ApplicationProvider
import com.rssuper.database.entities.FeedItemEntity
import com.rssuper.database.entities.SearchHistoryEntity
import com.rssuper.database.entities.SubscriptionEntity
import kotlinx.coroutines.flow.first
import kotlinx.coroutines.test.runTest
import org.junit.After
import org.junit.Assert.assertEquals
import org.junit.Assert.assertNotNull
import org.junit.Before
import org.junit.Test
import java.util.Date
import java.util.UUID
class RssDatabaseTest {
private lateinit var database: RssDatabase
@Before
fun createDb() {
val context = ApplicationProvider.getApplicationContext<Context>()
database = Room.inMemoryDatabaseBuilder(
context,
RssDatabase::class.java
)
.allowMainThreadQueries()
.build()
}
@After
fun closeDb() {
database.close()
}
@Test
fun databaseConstruction() {
assertNotNull(database.subscriptionDao())
assertNotNull(database.feedItemDao())
assertNotNull(database.searchHistoryDao())
}
@Test
fun ftsVirtualTableExists() {
val cursor = database.run {
openHelper.writableDatabase.query(
"SELECT name FROM sqlite_master WHERE type='table' AND name='feed_items_fts'",
emptyArray()
)
}
assertEquals(true, cursor.moveToFirst())
cursor.close()
}
@Test
fun subscriptionEntityRoundTrip() = runTest {
val now = Date()
val subscription = SubscriptionEntity(
id = UUID.randomUUID().toString(),
url = "https://example.com/feed",
title = "Test Feed",
category = "Tech",
enabled = true,
fetchInterval = 3600000,
createdAt = now,
updatedAt = now,
lastFetchedAt = null,
nextFetchAt = null,
error = null,
httpAuthUsername = null,
httpAuthPassword = null
)
database.subscriptionDao().insertSubscription(subscription)
val result = database.subscriptionDao().getSubscriptionById(subscription.id)
assertNotNull(result)
assertEquals(subscription.id, result?.id)
assertEquals(subscription.title, result?.title)
}
@Test
fun feedItemEntityRoundTrip() = runTest {
val now = Date()
val subscription = SubscriptionEntity(
id = "sub1",
url = "https://example.com/feed",
title = "Test Feed",
category = "Tech",
enabled = true,
fetchInterval = 3600000,
createdAt = now,
updatedAt = now,
lastFetchedAt = null,
nextFetchAt = null,
error = null,
httpAuthUsername = null,
httpAuthPassword = null
)
database.subscriptionDao().insertSubscription(subscription)
val item = FeedItemEntity(
id = UUID.randomUUID().toString(),
subscriptionId = "sub1",
title = "Test Item",
link = "https://example.com/item",
description = "Test description",
content = "Test content",
author = "Test Author",
published = now,
updated = now,
categories = "Tech",
enclosureUrl = null,
enclosureType = null,
enclosureLength = null,
guid = "guid-1",
isRead = false,
isStarred = false
)
database.feedItemDao().insertItem(item)
val result = database.feedItemDao().getItemById(item.id)
assertNotNull(result)
assertEquals(item.id, result?.id)
assertEquals(item.title, result?.title)
assertEquals("sub1", result?.subscriptionId)
}
@Test
fun searchHistoryEntityRoundTrip() = runTest {
val now = Date()
val search = SearchHistoryEntity(
id = UUID.randomUUID().toString(),
query = "kotlin coroutines",
timestamp = now
)
database.searchHistoryDao().insertSearchHistory(search)
val result = database.searchHistoryDao().getSearchHistoryById(search.id)
assertNotNull(result)
assertEquals(search.id, result?.id)
assertEquals(search.query, result?.query)
}
@Test
fun cascadeDeleteFeedItems() = runTest {
val now = Date()
val subscription = SubscriptionEntity(
id = "sub1",
url = "https://example.com/feed",
title = "Test Feed",
category = "Tech",
enabled = true,
fetchInterval = 3600000,
createdAt = now,
updatedAt = now,
lastFetchedAt = null,
nextFetchAt = null,
error = null,
httpAuthUsername = null,
httpAuthPassword = null
)
database.subscriptionDao().insertSubscription(subscription)
val item = FeedItemEntity(
id = "item1",
subscriptionId = "sub1",
title = "Test Item",
link = "https://example.com/item",
description = "Test description",
content = "Test content",
author = "Test Author",
published = now,
updated = now,
categories = "Tech",
enclosureUrl = null,
enclosureType = null,
enclosureLength = null,
guid = "guid-1",
isRead = false,
isStarred = false
)
database.feedItemDao().insertItem(item)
database.subscriptionDao().deleteSubscription(subscription)
val items = database.feedItemDao().getItemsBySubscription("sub1").first()
assertEquals(0, items.size)
}
}

@@ -1,188 +0,0 @@
package com.rssuper.database
import android.content.Context
import androidx.room.Room
import androidx.test.core.app.ApplicationProvider
import com.rssuper.database.daos.SearchHistoryDao
import com.rssuper.database.entities.SearchHistoryEntity
import kotlinx.coroutines.flow.first
import kotlinx.coroutines.test.runTest
import org.junit.After
import org.junit.Assert.assertEquals
import org.junit.Assert.assertNotNull
import org.junit.Assert.assertNull
import org.junit.Before
import org.junit.Test
import java.util.Date
class SearchHistoryDaoTest {
private lateinit var database: RssDatabase
private lateinit var dao: SearchHistoryDao
@Before
fun createDb() {
val context = ApplicationProvider.getApplicationContext<Context>()
database = Room.inMemoryDatabaseBuilder(
context,
RssDatabase::class.java
)
.allowMainThreadQueries()
.build()
dao = database.searchHistoryDao()
}
@After
fun closeDb() {
database.close()
}
@Test
fun insertAndGetSearchHistory() = runTest {
val search = createTestSearch("1", "kotlin")
dao.insertSearchHistory(search)
val result = dao.getSearchHistoryById("1")
assertNotNull(result)
assertEquals("1", result?.id)
assertEquals("kotlin", result?.query)
}
@Test
fun getAllSearchHistory() = runTest {
val search1 = createTestSearch("1", "kotlin")
val search2 = createTestSearch("2", "android")
val search3 = createTestSearch("3", "room database")
dao.insertSearchHistories(listOf(search1, search2, search3))
val result = dao.getAllSearchHistory().first()
assertEquals(3, result.size)
}
@Test
fun searchHistory() = runTest {
val search1 = createTestSearch("1", "kotlin coroutines")
val search2 = createTestSearch("2", "android kotlin")
val search3 = createTestSearch("3", "java")
dao.insertSearchHistories(listOf(search1, search2, search3))
val result = dao.searchHistory("kotlin").first()
assertEquals(2, result.size)
}
@Test
fun getRecentSearches() = runTest {
val search1 = createTestSearch("1", "query1", timestamp = Date(System.currentTimeMillis() - 300000))
val search2 = createTestSearch("2", "query2", timestamp = Date(System.currentTimeMillis() - 200000))
val search3 = createTestSearch("3", "query3", timestamp = Date(System.currentTimeMillis() - 100000))
dao.insertSearchHistories(listOf(search1, search2, search3))
val result = dao.getRecentSearches(2).first()
assertEquals(2, result.size)
assertEquals("3", result[0].id)
assertEquals("2", result[1].id)
}
@Test
fun getSearchHistoryCount() = runTest {
val search1 = createTestSearch("1", "query1")
val search2 = createTestSearch("2", "query2")
val search3 = createTestSearch("3", "query3")
dao.insertSearchHistories(listOf(search1, search2, search3))
val count = dao.getSearchHistoryCount().first()
assertEquals(3, count)
}
@Test
fun updateSearchHistory() = runTest {
val search = createTestSearch("1", "old query")
dao.insertSearchHistory(search)
val updated = search.copy(query = "new query")
dao.updateSearchHistory(updated)
val result = dao.getSearchHistoryById("1")
assertEquals("new query", result?.query)
}
@Test
fun deleteSearchHistory() = runTest {
val search = createTestSearch("1", "kotlin")
dao.insertSearchHistory(search)
dao.deleteSearchHistory(search)
val result = dao.getSearchHistoryById("1")
assertNull(result)
}
@Test
fun deleteSearchHistoryById() = runTest {
val search = createTestSearch("1", "kotlin")
dao.insertSearchHistory(search)
dao.deleteSearchHistoryById("1")
val result = dao.getSearchHistoryById("1")
assertNull(result)
}
@Test
fun deleteAllSearchHistory() = runTest {
val search1 = createTestSearch("1", "query1")
val search2 = createTestSearch("2", "query2")
dao.insertSearchHistories(listOf(search1, search2))
dao.deleteAllSearchHistory()
val result = dao.getAllSearchHistory().first()
assertEquals(0, result.size)
}
@Test
fun deleteSearchHistoryOlderThan() = runTest {
val oldSearch = createTestSearch("1", "old query", timestamp = Date(System.currentTimeMillis() - 86400000 * 2))
val recentSearch = createTestSearch("2", "recent query", timestamp = Date(System.currentTimeMillis() - 86400000))
dao.insertSearchHistories(listOf(oldSearch, recentSearch))
dao.deleteSearchHistoryOlderThan(System.currentTimeMillis() - 86400000)
val result = dao.getAllSearchHistory().first()
assertEquals(1, result.size)
assertEquals("2", result[0].id)
}
@Test
fun insertSearchHistoryWithConflict() = runTest {
val search = createTestSearch("1", "kotlin")
dao.insertSearchHistory(search)
val duplicate = search.copy(query = "android")
val result = dao.insertSearchHistory(duplicate)
assertEquals(-1L, result)
val dbSearch = dao.getSearchHistoryById("1")
assertEquals("kotlin", dbSearch?.query)
}
private fun createTestSearch(
id: String,
query: String,
timestamp: Date = Date()
): SearchHistoryEntity {
return SearchHistoryEntity(
id = id,
query = query,
timestamp = timestamp
)
}
}

@@ -1,204 +0,0 @@
package com.rssuper.database
import android.content.Context
import androidx.room.Room
import androidx.test.core.app.ApplicationProvider
import com.rssuper.database.daos.SubscriptionDao
import com.rssuper.database.entities.SubscriptionEntity
import kotlinx.coroutines.flow.first
import kotlinx.coroutines.test.runTest
import org.junit.After
import org.junit.Assert.assertEquals
import org.junit.Assert.assertNotNull
import org.junit.Assert.assertNull
import org.junit.Before
import org.junit.Test
import java.util.Date
class SubscriptionDaoTest {
private lateinit var database: RssDatabase
private lateinit var dao: SubscriptionDao
@Before
fun createDb() {
val context = ApplicationProvider.getApplicationContext<Context>()
database = Room.inMemoryDatabaseBuilder(
context,
RssDatabase::class.java
)
.allowMainThreadQueries()
.build()
dao = database.subscriptionDao()
}
@After
fun closeDb() {
database.close()
}
@Test
fun insertAndGetSubscription() = runTest {
val subscription = createTestSubscription("1")
dao.insertSubscription(subscription)
val result = dao.getSubscriptionById("1")
assertNotNull(result)
assertEquals("1", result?.id)
assertEquals("Test Feed", result?.title)
}
@Test
fun getSubscriptionByUrl() = runTest {
val subscription = createTestSubscription("1", url = "https://example.com/feed")
dao.insertSubscription(subscription)
val result = dao.getSubscriptionByUrl("https://example.com/feed")
assertNotNull(result)
assertEquals("1", result?.id)
}
@Test
fun getAllSubscriptions() = runTest {
val subscription1 = createTestSubscription("1")
val subscription2 = createTestSubscription("2")
dao.insertSubscriptions(listOf(subscription1, subscription2))
val result = dao.getAllSubscriptions().first()
assertEquals(2, result.size)
}
@Test
fun getEnabledSubscriptions() = runTest {
val enabled = createTestSubscription("1", enabled = true)
val disabled = createTestSubscription("2", enabled = false)
dao.insertSubscriptions(listOf(enabled, disabled))
val result = dao.getEnabledSubscriptions().first()
assertEquals(1, result.size)
assertEquals("1", result[0].id)
}
@Test
fun updateSubscription() = runTest {
val subscription = createTestSubscription("1")
dao.insertSubscription(subscription)
val updated = subscription.copy(title = "Updated Title")
dao.updateSubscription(updated)
val result = dao.getSubscriptionById("1")
assertEquals("Updated Title", result?.title)
}
@Test
fun deleteSubscription() = runTest {
val subscription = createTestSubscription("1")
dao.insertSubscription(subscription)
dao.deleteSubscription(subscription)
val result = dao.getSubscriptionById("1")
assertNull(result)
}
@Test
fun deleteSubscriptionById() = runTest {
val subscription = createTestSubscription("1")
dao.insertSubscription(subscription)
dao.deleteSubscriptionById("1")
val result = dao.getSubscriptionById("1")
assertNull(result)
}
@Test
fun getSubscriptionCount() = runTest {
val subscription1 = createTestSubscription("1")
val subscription2 = createTestSubscription("2")
dao.insertSubscriptions(listOf(subscription1, subscription2))
val count = dao.getSubscriptionCount().first()
assertEquals(2, count)
}
@Test
fun updateError() = runTest {
val subscription = createTestSubscription("1")
dao.insertSubscription(subscription)
dao.updateError("1", "Feed not found")
val result = dao.getSubscriptionById("1")
assertEquals("Feed not found", result?.error)
}
@Test
fun updateLastFetchedAt() = runTest {
val subscription = createTestSubscription("1")
val now = Date()
dao.insertSubscription(subscription)
dao.updateLastFetchedAt("1", now)
val result = dao.getSubscriptionById("1")
assertEquals(now, result?.lastFetchedAt)
assertNull(result?.error)
}
@Test
fun updateNextFetchAt() = runTest {
val subscription = createTestSubscription("1")
val future = Date(System.currentTimeMillis() + 3600000)
dao.insertSubscription(subscription)
dao.updateNextFetchAt("1", future)
val result = dao.getSubscriptionById("1")
assertEquals(future, result?.nextFetchAt)
}
@Test
fun insertSubscriptionWithConflict() = runTest {
val subscription = createTestSubscription("1")
dao.insertSubscription(subscription)
val updated = subscription.copy(title = "Updated")
dao.insertSubscription(updated)
val result = dao.getSubscriptionById("1")
assertEquals("Updated", result?.title)
}
private fun createTestSubscription(
id: String,
url: String = "https://example.com/feed/$id",
title: String = "Test Feed",
enabled: Boolean = true
): SubscriptionEntity {
val now = Date()
return SubscriptionEntity(
id = id,
url = url,
title = title,
category = "Tech",
enabled = enabled,
fetchInterval = 3600000,
createdAt = now,
updatedAt = now,
lastFetchedAt = null,
nextFetchAt = null,
error = null,
httpAuthUsername = null,
httpAuthPassword = null
)
}
}

@@ -1,134 +0,0 @@
package com.rssuper.models
import com.squareup.moshi.Moshi
import com.squareup.moshi.kotlin.reflect.KotlinJsonAdapterFactory
import org.junit.Assert.assertEquals
import org.junit.Assert.assertNotNull
import org.junit.Assert.assertNull
import org.junit.Before
import org.junit.Test
import java.util.Date
class FeedItemTest {
private lateinit var moshi: Moshi
private lateinit var adapter: com.squareup.moshi.JsonAdapter<FeedItem>
@Before
fun setup() {
    moshi = Moshi.Builder()
        // Moshi has no built-in adapter for platform class java.util.Date;
        // without one, moshi.adapter(FeedItem::class.java) throws.
        // Map epoch millis <-> Date to match the JSON used in these tests.
        .add(Date::class.java, object : com.squareup.moshi.JsonAdapter<Date>() {
            override fun fromJson(reader: com.squareup.moshi.JsonReader) = Date(reader.nextLong())
            override fun toJson(writer: com.squareup.moshi.JsonWriter, value: Date?) { writer.value(value?.time) }
        })
        .add(KotlinJsonAdapterFactory())
        .build()
    adapter = moshi.adapter(FeedItem::class.java)
}
@Test
fun testSerialization() {
val feedItem = FeedItem(
id = "item-1",
title = "Test Article",
link = "https://example.com/article",
description = "Short description",
content = "Full content here",
author = "John Doe",
published = Date(1672531200000),
categories = listOf("Tech", "News"),
guid = "guid-123",
subscriptionTitle = "Tech News"
)
val json = adapter.toJson(feedItem)
assertNotNull(json)
}
@Test
fun testDeserialization() {
val json = """{
"id": "item-1",
"title": "Test Article",
"link": "https://example.com/article",
"description": "Short description",
"author": "John Doe",
"published": 1672531200000,
"categories": ["Tech", "News"],
"guid": "guid-123",
"subscriptionTitle": "Tech News"
}"""
val feedItem = adapter.fromJson(json)
assertNotNull(feedItem)
assertEquals("item-1", feedItem?.id)
assertEquals("Test Article", feedItem?.title)
assertEquals("John Doe", feedItem?.author)
}
@Test
fun testOptionalFieldsNull() {
val json = """{
"id": "item-1",
"title": "Test Article"
}"""
val feedItem = adapter.fromJson(json)
assertNotNull(feedItem)
assertNull(feedItem?.link)
assertNull(feedItem?.description)
assertNull(feedItem?.author)
}
@Test
fun testEnclosureSerialization() {
val feedItem = FeedItem(
id = "item-1",
title = "Podcast Episode",
enclosure = Enclosure(
url = "https://example.com/episode.mp3",
type = "audio/mpeg",
length = 12345678
)
)
val json = adapter.toJson(feedItem)
assertNotNull(json)
}
@Test
fun testCopy() {
val original = FeedItem(
id = "item-1",
title = "Original Title",
author = "Original Author"
)
val modified = original.copy(title = "Modified Title")
assertEquals("item-1", modified.id)
assertEquals("Modified Title", modified.title)
assertEquals("Original Author", modified.author)
}
@Test
fun testEqualsAndHashCode() {
val item1 = FeedItem(id = "item-1", title = "Test")
val item2 = FeedItem(id = "item-1", title = "Test")
val item3 = FeedItem(id = "item-2", title = "Test")
assertEquals(item1, item2)
assertEquals(item1.hashCode(), item2.hashCode())
assert(item1 != item3)
}
@Test
fun testToString() {
val feedItem = FeedItem(
id = "item-1",
title = "Test Article",
author = "John Doe"
)
val toString = feedItem.toString()
assertNotNull(toString)
assert(toString.contains("id=item-1"))
assert(toString.contains("title=Test Article"))
}
}

@@ -1,199 +0,0 @@
package com.rssuper.models
import com.squareup.moshi.Moshi
import com.squareup.moshi.kotlin.reflect.KotlinJsonAdapterFactory
import org.junit.Assert.assertEquals
import org.junit.Assert.assertNotNull
import org.junit.Assert.assertNull
import org.junit.Before
import org.junit.Test
import java.util.Date
class FeedSubscriptionTest {
private lateinit var moshi: Moshi
private lateinit var adapter: com.squareup.moshi.JsonAdapter<FeedSubscription>
@Before
fun setup() {
    moshi = Moshi.Builder()
        // Moshi has no built-in adapter for platform class java.util.Date;
        // map epoch millis <-> Date to match the JSON used in these tests.
        .add(Date::class.java, object : com.squareup.moshi.JsonAdapter<Date>() {
            override fun fromJson(reader: com.squareup.moshi.JsonReader) = Date(reader.nextLong())
            override fun toJson(writer: com.squareup.moshi.JsonWriter, value: Date?) { writer.value(value?.time) }
        })
        .add(KotlinJsonAdapterFactory())
        .build()
    adapter = moshi.adapter(FeedSubscription::class.java)
}
@Test
fun testSerialization() {
val subscription = FeedSubscription(
id = "sub-1",
url = "https://example.com/feed.xml",
title = "Tech News",
category = "Technology",
enabled = true,
fetchInterval = 60,
createdAt = Date(1672531200000),
updatedAt = Date(1672617600000)
)
val json = adapter.toJson(subscription)
assertNotNull(json)
}
@Test
fun testDeserialization() {
val json = """{
"id": "sub-1",
"url": "https://example.com/feed.xml",
"title": "Tech News",
"category": "Technology",
"enabled": true,
"fetchInterval": 60,
"createdAt": 1672531200000,
"updatedAt": 1672617600000
}"""
val subscription = adapter.fromJson(json)
assertNotNull(subscription)
assertEquals("sub-1", subscription?.id)
assertEquals("https://example.com/feed.xml", subscription?.url)
assertEquals("Tech News", subscription?.title)
assertEquals("Technology", subscription?.category)
assertEquals(true, subscription?.enabled)
assertEquals(60, subscription?.fetchInterval)
}
@Test
fun testOptionalFieldsNull() {
val json = """{
"id": "sub-1",
"url": "https://example.com/feed.xml",
"title": "Tech News",
"enabled": true,
"fetchInterval": 60,
"createdAt": 1672531200000,
"updatedAt": 1672617600000
}"""
val subscription = adapter.fromJson(json)
assertNotNull(subscription)
assertNull(subscription?.category)
assertNull(subscription?.error)
assertNull(subscription?.httpAuth)
}
@Test
fun testHttpAuthSerialization() {
val subscription = FeedSubscription(
id = "sub-1",
url = "https://example.com/feed.xml",
title = "Private Feed",
enabled = true,
fetchInterval = 60,
createdAt = Date(1672531200000),
updatedAt = Date(1672617600000),
httpAuth = HttpAuth(
username = "user123",
password = "pass456"
)
)
val json = adapter.toJson(subscription)
assertNotNull(json)
}
@Test
fun testHttpAuthDeserialization() {
val json = """{
"id": "sub-1",
"url": "https://example.com/feed.xml",
"title": "Private Feed",
"enabled": true,
"fetchInterval": 60,
"createdAt": 1672531200000,
"updatedAt": 1672617600000,
"httpAuth": {
"username": "user123",
"password": "pass456"
}
}"""
val subscription = adapter.fromJson(json)
assertNotNull(subscription)
assertNotNull(subscription?.httpAuth)
assertEquals("user123", subscription?.httpAuth?.username)
assertEquals("pass456", subscription?.httpAuth?.password)
}
@Test
fun testCopy() {
val original = FeedSubscription(
id = "sub-1",
url = "https://example.com/feed.xml",
title = "Original Title",
enabled = true,
fetchInterval = 60,
createdAt = Date(1672531200000),
updatedAt = Date(1672617600000)
)
val modified = original.copy(title = "Modified Title", enabled = false)
assertEquals("sub-1", modified.id)
assertEquals("Modified Title", modified.title)
assertEquals(false, modified.enabled)
assertEquals(60, modified.fetchInterval)
}
@Test
fun testEqualsAndHashCode() {
val sub1 = FeedSubscription(
id = "sub-1",
url = "https://example.com",
title = "Test",
enabled = true,
fetchInterval = 60,
createdAt = Date(1672531200000),
updatedAt = Date(1672617600000)
)
val sub2 = FeedSubscription(
id = "sub-1",
url = "https://example.com",
title = "Test",
enabled = true,
fetchInterval = 60,
createdAt = Date(1672531200000),
updatedAt = Date(1672617600000)
)
val sub3 = FeedSubscription(
id = "sub-2",
url = "https://example.com",
title = "Test",
enabled = true,
fetchInterval = 60,
createdAt = Date(1672531200000),
updatedAt = Date(1672617600000)
)
assertEquals(sub1, sub2)
assertEquals(sub1.hashCode(), sub2.hashCode())
assert(sub1 != sub3)
}
@Test
fun testToString() {
val subscription = FeedSubscription(
id = "sub-1",
url = "https://example.com/feed.xml",
title = "Tech News",
enabled = true,
fetchInterval = 60,
createdAt = Date(1672531200000),
updatedAt = Date(1672617600000)
)
val toString = subscription.toString()
assertNotNull(toString)
assert(toString.contains("id=sub-1"))
assert(toString.contains("title=Tech News"))
}
}

@@ -1,139 +0,0 @@
package com.rssuper.models
import com.squareup.moshi.Moshi
import com.squareup.moshi.kotlin.reflect.KotlinJsonAdapterFactory
import org.junit.Assert.assertEquals
import org.junit.Assert.assertNotNull
import org.junit.Assert.assertNull
import org.junit.Assert.assertTrue
import org.junit.Before
import org.junit.Test
import java.util.Date
class FeedTest {
private lateinit var moshi: Moshi
private lateinit var adapter: com.squareup.moshi.JsonAdapter<Feed>
@Before
fun setup() {
    moshi = Moshi.Builder()
        // Moshi has no built-in adapter for platform class java.util.Date;
        // map epoch millis <-> Date to match the JSON used in these tests.
        .add(Date::class.java, object : com.squareup.moshi.JsonAdapter<Date>() {
            override fun fromJson(reader: com.squareup.moshi.JsonReader) = Date(reader.nextLong())
            override fun toJson(writer: com.squareup.moshi.JsonWriter, value: Date?) { writer.value(value?.time) }
        })
        .add(KotlinJsonAdapterFactory())
        .build()
    adapter = moshi.adapter(Feed::class.java)
}
@Test
fun testSerialization() {
val feed = Feed(
id = "feed-1",
title = "Tech News",
link = "https://example.com",
description = "Technology news feed",
subtitle = "Daily tech updates",
language = "en",
rawUrl = "https://example.com/feed.xml",
ttl = 60,
items = listOf(
FeedItem(id = "item-1", title = "Article 1"),
FeedItem(id = "item-2", title = "Article 2")
)
)
val json = adapter.toJson(feed)
assertNotNull(json)
}
@Test
fun testDeserialization() {
val json = """{
"id": "feed-1",
"title": "Tech News",
"link": "https://example.com",
"description": "Technology news feed",
"subtitle": "Daily tech updates",
"language": "en",
"rawUrl": "https://example.com/feed.xml",
"ttl": 60,
"items": [
{"id": "item-1", "title": "Article 1"},
{"id": "item-2", "title": "Article 2"}
]
}"""
val feed = adapter.fromJson(json)
assertNotNull(feed)
assertEquals("feed-1", feed?.id)
assertEquals("Tech News", feed?.title)
assertEquals(2, feed?.items?.size)
}
@Test
fun testOptionalFieldsNull() {
val json = """{
"id": "feed-1",
"title": "Tech News",
"rawUrl": "https://example.com/feed.xml"
}"""
val feed = adapter.fromJson(json)
assertNotNull(feed)
assertNull(feed?.link)
assertNull(feed?.description)
assertNull(feed?.language)
}
@Test
fun testEmptyItemsList() {
val json = """{
"id": "feed-1",
"title": "Tech News",
"rawUrl": "https://example.com/feed.xml",
"items": []
}"""
val feed = adapter.fromJson(json)
assertNotNull(feed)
assertTrue(feed?.items?.isEmpty() == true)
}
@Test
fun testCopy() {
val original = Feed(
id = "feed-1",
title = "Original Title",
rawUrl = "https://example.com/feed.xml"
)
val modified = original.copy(title = "Modified Title")
assertEquals("feed-1", modified.id)
assertEquals("Modified Title", modified.title)
assertEquals("https://example.com/feed.xml", modified.rawUrl)
}
@Test
fun testEqualsAndHashCode() {
val feed1 = Feed(id = "feed-1", title = "Test", rawUrl = "https://example.com")
val feed2 = Feed(id = "feed-1", title = "Test", rawUrl = "https://example.com")
val feed3 = Feed(id = "feed-2", title = "Test", rawUrl = "https://example.com")
assertEquals(feed1, feed2)
assertEquals(feed1.hashCode(), feed2.hashCode())
assert(feed1 != feed3)
}
@Test
fun testToString() {
val feed = Feed(
id = "feed-1",
title = "Tech News",
rawUrl = "https://example.com/feed.xml"
)
val toString = feed.toString()
assertNotNull(toString)
assert(toString.contains("id=feed-1"))
assert(toString.contains("title=Tech News"))
}
}

@@ -1,108 +0,0 @@
package com.rssuper.models
import com.squareup.moshi.Moshi
import com.squareup.moshi.kotlin.reflect.KotlinJsonAdapterFactory
import org.junit.Assert.assertEquals
import org.junit.Assert.assertNotNull
import org.junit.Before
import org.junit.Test
class NotificationPreferencesTest {
private lateinit var moshi: Moshi
private lateinit var adapter: com.squareup.moshi.JsonAdapter<NotificationPreferences>
@Before
fun setup() {
moshi = Moshi.Builder()
.add(KotlinJsonAdapterFactory())
.build()
adapter = moshi.adapter(NotificationPreferences::class.java)
}
@Test
fun testSerialization() {
val preferences = NotificationPreferences(
newArticles = true,
episodeReleases = true,
customAlerts = false,
badgeCount = true,
sound = true,
vibration = false
)
val json = adapter.toJson(preferences)
assertNotNull(json)
}
@Test
fun testDeserialization() {
val json = """{
"newArticles": true,
"episodeReleases": true,
"customAlerts": false,
"badgeCount": true,
"sound": true,
"vibration": false
}"""
val preferences = adapter.fromJson(json)
assertNotNull(preferences)
assertEquals(true, preferences?.newArticles)
assertEquals(true, preferences?.episodeReleases)
assertEquals(false, preferences?.customAlerts)
assertEquals(true, preferences?.badgeCount)
assertEquals(true, preferences?.sound)
assertEquals(false, preferences?.vibration)
}
@Test
fun testDefaultValues() {
val preferences = NotificationPreferences()
assertEquals(true, preferences.newArticles)
assertEquals(true, preferences.episodeReleases)
assertEquals(false, preferences.customAlerts)
assertEquals(true, preferences.badgeCount)
assertEquals(true, preferences.sound)
assertEquals(true, preferences.vibration)
}
@Test
fun testCopy() {
val original = NotificationPreferences(
newArticles = true,
sound = true
)
val modified = original.copy(newArticles = false, sound = false)
assertEquals(false, modified.newArticles)
assertEquals(false, modified.sound)
assertEquals(true, modified.episodeReleases)
}
@Test
fun testEqualsAndHashCode() {
val pref1 = NotificationPreferences(newArticles = true, sound = true)
val pref2 = NotificationPreferences(newArticles = true, sound = true)
val pref3 = NotificationPreferences(newArticles = false, sound = true)
assertEquals(pref1, pref2)
assertEquals(pref1.hashCode(), pref2.hashCode())
assert(pref1 != pref3)
}
@Test
fun testToString() {
val preferences = NotificationPreferences(
newArticles = true,
sound = true
)
val toString = preferences.toString()
assertNotNull(toString)
assert(toString.contains("newArticles"))
assert(toString.contains("sound"))
}
}

@@ -1,141 +0,0 @@
package com.rssuper.models
import com.squareup.moshi.Moshi
import com.squareup.moshi.kotlin.reflect.KotlinJsonAdapterFactory
import org.junit.Assert.assertEquals
import org.junit.Assert.assertNotNull
import org.junit.Before
import org.junit.Test
class ReadingPreferencesTest {
private lateinit var moshi: Moshi
private lateinit var adapter: com.squareup.moshi.JsonAdapter<ReadingPreferences>
@Before
fun setup() {
moshi = Moshi.Builder()
.add(KotlinJsonAdapterFactory())
.build()
adapter = moshi.adapter(ReadingPreferences::class.java)
}
@Test
fun testSerialization() {
val preferences = ReadingPreferences(
fontSize = FontSize.LARGE,
lineHeight = LineHeight.RELAXED,
showTableOfContents = true,
showReadingTime = true,
showAuthor = false,
showDate = true
)
val json = adapter.toJson(preferences)
assertNotNull(json)
}
@Test
fun testDeserialization() {
val json = """{
"fontSize": "large",
"lineHeight": "relaxed",
"showTableOfContents": true,
"showReadingTime": true,
"showAuthor": false,
"showDate": true
}"""
val preferences = adapter.fromJson(json)
assertNotNull(preferences)
assertEquals(FontSize.LARGE, preferences?.fontSize)
assertEquals(LineHeight.RELAXED, preferences?.lineHeight)
assertEquals(true, preferences?.showTableOfContents)
assertEquals(true, preferences?.showReadingTime)
assertEquals(false, preferences?.showAuthor)
assertEquals(true, preferences?.showDate)
}
@Test
fun testFontSizeOptions() {
val fontSizes = listOf(
"small" to FontSize.SMALL,
"medium" to FontSize.MEDIUM,
"large" to FontSize.LARGE,
"xlarge" to FontSize.XLARGE
)
for ((jsonValue, expectedEnum) in fontSizes) {
val json = """{"fontSize": "$jsonValue"}"""
val preferences = adapter.fromJson(json)
assertNotNull("Failed for fontSize: $jsonValue", preferences)
assertEquals("Failed for fontSize: $jsonValue", expectedEnum, preferences?.fontSize)
}
}
@Test
fun testLineHeightOptions() {
val lineHeights = listOf(
"normal" to LineHeight.NORMAL,
"relaxed" to LineHeight.RELAXED,
"loose" to LineHeight.LOOSE
)
for ((jsonValue, expectedEnum) in lineHeights) {
val json = """{"lineHeight": "$jsonValue"}"""
val preferences = adapter.fromJson(json)
assertNotNull("Failed for lineHeight: $jsonValue", preferences)
assertEquals("Failed for lineHeight: $jsonValue", expectedEnum, preferences?.lineHeight)
}
}
@Test
fun testDefaultValues() {
val preferences = ReadingPreferences()
assertEquals(FontSize.MEDIUM, preferences.fontSize)
assertEquals(LineHeight.NORMAL, preferences.lineHeight)
assertEquals(false, preferences.showTableOfContents)
assertEquals(true, preferences.showReadingTime)
assertEquals(true, preferences.showAuthor)
assertEquals(true, preferences.showDate)
}
@Test
fun testCopy() {
val original = ReadingPreferences(
fontSize = FontSize.MEDIUM,
showReadingTime = true
)
val modified = original.copy(fontSize = FontSize.XLARGE, showReadingTime = false)
assertEquals(FontSize.XLARGE, modified.fontSize)
assertEquals(false, modified.showReadingTime)
assertEquals(LineHeight.NORMAL, modified.lineHeight)
}
@Test
fun testEqualsAndHashCode() {
val pref1 = ReadingPreferences(fontSize = FontSize.MEDIUM, showReadingTime = true)
val pref2 = ReadingPreferences(fontSize = FontSize.MEDIUM, showReadingTime = true)
val pref3 = ReadingPreferences(fontSize = FontSize.LARGE, showReadingTime = true)
assertEquals(pref1, pref2)
assertEquals(pref1.hashCode(), pref2.hashCode())
assert(pref1 != pref3)
}
@Test
fun testToString() {
val preferences = ReadingPreferences(
fontSize = FontSize.LARGE,
showReadingTime = true
)
val toString = preferences.toString()
assertNotNull(toString)
assert(toString.contains("fontSize"))
assert(toString.contains("showReadingTime"))
}
}

@@ -1,156 +0,0 @@
package com.rssuper.models
import com.squareup.moshi.Moshi
import com.squareup.moshi.kotlin.reflect.KotlinJsonAdapterFactory
import org.junit.Assert.assertEquals
import org.junit.Assert.assertNotNull
import org.junit.Assert.assertNull
import org.junit.Before
import org.junit.Test
import java.util.Date
class SearchFiltersTest {
private lateinit var moshi: Moshi
private lateinit var adapter: com.squareup.moshi.JsonAdapter<SearchFilters>
@Before
fun setup() {
moshi = Moshi.Builder()
.add(KotlinJsonAdapterFactory())
.build()
adapter = moshi.adapter(SearchFilters::class.java)
}
@Test
fun testSerialization() {
val filters = SearchFilters(
dateFrom = Date(1672531200000),
dateTo = Date(1672617600000),
feedIds = listOf("feed-1", "feed-2"),
authors = listOf("John Doe", "Jane Smith"),
contentType = ContentType.ARTICLE,
sortOption = SearchSortOption.DATE_DESC
)
val json = adapter.toJson(filters)
assertNotNull(json)
}
@Test
fun testDeserialization() {
val json = """{
"dateFrom": 1672531200000,
"dateTo": 1672617600000,
"feedIds": ["feed-1", "feed-2"],
"authors": ["John Doe", "Jane Smith"],
"contentType": "article",
"sortOption": "date_desc"
}"""
val filters = adapter.fromJson(json)
assertNotNull(filters)
assertNotNull(filters?.dateFrom)
assertNotNull(filters?.dateTo)
assertEquals(2, filters?.feedIds?.size)
assertEquals(2, filters?.authors?.size)
assertEquals(ContentType.ARTICLE, filters?.contentType)
assertEquals(SearchSortOption.DATE_DESC, filters?.sortOption)
}
@Test
fun testContentTypeAudio() {
val json = """{
"contentType": "audio"
}"""
val filters = adapter.fromJson(json)
assertNotNull(filters)
assertEquals(ContentType.AUDIO, filters?.contentType)
}
@Test
fun testContentTypeVideo() {
val json = """{
"contentType": "video"
}"""
val filters = adapter.fromJson(json)
assertNotNull(filters)
assertEquals(ContentType.VIDEO, filters?.contentType)
}
@Test
fun testSortOptions() {
val sortOptions = listOf(
"relevance" to SearchSortOption.RELEVANCE,
"date_desc" to SearchSortOption.DATE_DESC,
"date_asc" to SearchSortOption.DATE_ASC,
"title_asc" to SearchSortOption.TITLE_ASC,
"title_desc" to SearchSortOption.TITLE_DESC,
"feed_asc" to SearchSortOption.FEED_ASC,
"feed_desc" to SearchSortOption.FEED_DESC
)
for ((jsonValue, expectedEnum) in sortOptions) {
val json = """{"sortOption": "$jsonValue"}"""
val filters = adapter.fromJson(json)
assertNotNull("Failed for sortOption: $jsonValue", filters)
assertEquals("Failed for sortOption: $jsonValue", expectedEnum, filters?.sortOption)
}
}
@Test
fun testOptionalFieldsNull() {
val json = "{}"
val filters = adapter.fromJson(json)
assertNotNull(filters)
assertNull(filters?.dateFrom)
assertNull(filters?.dateTo)
assertNull(filters?.feedIds)
assertNull(filters?.authors)
assertNull(filters?.contentType)
assertEquals(SearchSortOption.RELEVANCE, filters?.sortOption)
}
@Test
fun testCopy() {
val original = SearchFilters(
feedIds = listOf("feed-1"),
sortOption = SearchSortOption.RELEVANCE
)
val modified = original.copy(
feedIds = listOf("feed-1", "feed-2"),
sortOption = SearchSortOption.DATE_DESC
)
assertEquals(2, modified.feedIds?.size)
assertEquals(SearchSortOption.DATE_DESC, modified.sortOption)
}
@Test
fun testEqualsAndHashCode() {
val filters1 = SearchFilters(feedIds = listOf("feed-1"), sortOption = SearchSortOption.RELEVANCE)
val filters2 = SearchFilters(feedIds = listOf("feed-1"), sortOption = SearchSortOption.RELEVANCE)
val filters3 = SearchFilters(feedIds = listOf("feed-2"), sortOption = SearchSortOption.RELEVANCE)
assertEquals(filters1, filters2)
assertEquals(filters1.hashCode(), filters2.hashCode())
assert(filters1 != filters3)
}
@Test
fun testToString() {
val filters = SearchFilters(
feedIds = listOf("feed-1"),
sortOption = SearchSortOption.DATE_DESC
)
val toString = filters.toString()
assertNotNull(toString)
assert(toString.contains("feedIds"))
assert(toString.contains("sortOption"))
}
}


@@ -1,153 +0,0 @@
package com.rssuper.models
import com.squareup.moshi.Moshi
import com.squareup.moshi.kotlin.reflect.KotlinJsonAdapterFactory
import org.junit.Assert.assertEquals
import org.junit.Assert.assertNotNull
import org.junit.Assert.assertNull
import org.junit.Before
import org.junit.Test
import java.util.Date
class SearchResultTest {
private lateinit var moshi: Moshi
private lateinit var adapter: com.squareup.moshi.JsonAdapter<SearchResult>
@Before
fun setup() {
moshi = Moshi.Builder()
.add(KotlinJsonAdapterFactory())
.build()
adapter = moshi.adapter(SearchResult::class.java)
}
@Test
fun testArticleSerialization() {
val result = SearchResult(
id = "article-1",
type = SearchResultType.ARTICLE,
title = "Test Article",
snippet = "This is a snippet",
link = "https://example.com/article",
feedTitle = "Tech News",
published = Date(1672531200000),
score = 0.95
)
val json = adapter.toJson(result)
assertNotNull(json)
}
@Test
fun testFeedSerialization() {
val result = SearchResult(
id = "feed-1",
type = SearchResultType.FEED,
title = "Tech News Feed",
snippet = "Technology news and updates",
link = "https://example.com",
score = 0.85
)
val json = adapter.toJson(result)
assertNotNull(json)
}
@Test
fun testArticleDeserialization() {
val json = """{
"id": "article-1",
"type": "article",
"title": "Test Article",
"snippet": "This is a snippet",
"link": "https://example.com/article",
"feedTitle": "Tech News",
"published": 1672531200000,
"score": 0.95
}"""
val result = adapter.fromJson(json)
assertNotNull(result)
assertEquals("article-1", result?.id)
assertEquals(SearchResultType.ARTICLE, result?.type)
assertEquals("Test Article", result?.title)
assertEquals("This is a snippet", result?.snippet)
}
@Test
fun testFeedDeserialization() {
val json = """{
"id": "feed-1",
"type": "feed",
"title": "Tech News Feed",
"snippet": "Technology news and updates",
"link": "https://example.com",
"score": 0.85
}"""
val result = adapter.fromJson(json)
assertNotNull(result)
assertEquals("feed-1", result?.id)
assertEquals(SearchResultType.FEED, result?.type)
}
@Test
fun testOptionalFieldsNull() {
val json = """{
"id": "article-1",
"type": "article",
"title": "Test Article"
}"""
val result = adapter.fromJson(json)
assertNotNull(result)
assertNull(result?.snippet)
assertNull(result?.link)
assertNull(result?.feedTitle)
assertNull(result?.published)
assertNull(result?.score)
}
@Test
fun testCopy() {
val original = SearchResult(
id = "article-1",
type = SearchResultType.ARTICLE,
title = "Original Title"
)
val modified = original.copy(title = "Modified Title", score = 0.99)
assertEquals("article-1", modified.id)
assertEquals(SearchResultType.ARTICLE, modified.type)
assertEquals("Modified Title", modified.title)
assertEquals(0.99, modified.score!!, 0.001)
}
@Test
fun testEqualsAndHashCode() {
val result1 = SearchResult(id = "article-1", type = SearchResultType.ARTICLE, title = "Test")
val result2 = SearchResult(id = "article-1", type = SearchResultType.ARTICLE, title = "Test")
val result3 = SearchResult(id = "article-2", type = SearchResultType.ARTICLE, title = "Test")
assertEquals(result1, result2)
assertEquals(result1.hashCode(), result2.hashCode())
assert(result1 != result3)
}
@Test
fun testToString() {
val result = SearchResult(
id = "article-1",
type = SearchResultType.ARTICLE,
title = "Test Article",
score = 0.95
)
val toString = result.toString()
assertNotNull(toString)
assert(toString.contains("id=article-1"))
assert(toString.contains("title=Test Article"))
}
}


@@ -1,245 +0,0 @@
package com.rssuper.parsing
import com.rssuper.models.Enclosure
import org.junit.Assert.assertEquals
import org.junit.Assert.assertNotNull
import org.junit.Assert.assertNull
import org.junit.Test
import org.junit.runner.RunWith
import org.robolectric.RobolectricTestRunner
import org.robolectric.annotation.Config
@RunWith(RobolectricTestRunner::class)
@Config(sdk = [24])
class AtomParserTest {
@Test
fun testParseBasicAtom() {
val xml = """
<?xml version="1.0" encoding="UTF-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
<title>Atom Feed</title>
<subtitle>Feed subtitle</subtitle>
<link href="https://example.com" rel="alternate"/>
<id>urn:uuid:feed-id-123</id>
<updated>2024-01-01T12:00:00Z</updated>
<generator>Atom Generator</generator>
<entry>
<title>Entry 1</title>
<link href="https://example.com/entry1" rel="alternate"/>
<id>urn:uuid:entry-1</id>
<updated>2024-01-01T10:00:00Z</updated>
<summary>Summary of entry 1</summary>
</entry>
<entry>
<title>Entry 2</title>
<link href="https://example.com/entry2" rel="alternate"/>
<id>urn:uuid:entry-2</id>
<updated>2023-12-31T10:00:00Z</updated>
<content>Full content of entry 2</content>
</entry>
</feed>
""".trimIndent()
val feed = AtomParser.parse(xml, "https://example.com/feed.atom")
assertNotNull(feed)
assertEquals("Atom Feed", feed.title)
assertEquals("https://example.com", feed.link)
assertEquals("Feed subtitle", feed.subtitle)
assertEquals(2, feed.items.size)
val entry1 = feed.items[0]
assertEquals("Entry 1", entry1.title)
assertEquals("https://example.com/entry1", entry1.link)
assertEquals("Summary of entry 1", entry1.description)
assertNotNull(entry1.published)
val entry2 = feed.items[1]
assertEquals("Entry 2", entry2.title)
assertEquals("Full content of entry 2", entry2.content)
}
@Test
fun testParseAtomWithAuthor() {
val xml = """
<?xml version="1.0" encoding="UTF-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
<title>Author Feed</title>
<id>urn:uuid:feed-id</id>
<entry>
<title>Entry with Author</title>
<id>urn:uuid:entry</id>
<author>
<name>John Doe</name>
<email>john@example.com</email>
</author>
</entry>
</feed>
""".trimIndent()
val feed = AtomParser.parse(xml, "https://example.com/feed.atom")
assertNotNull(feed)
val entry = feed.items[0]
assertEquals("John Doe", entry.author)
}
@Test
fun testParseAtomWithCategories() {
val xml = """
<?xml version="1.0" encoding="UTF-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
<title>Categorized Feed</title>
<id>urn:uuid:feed-id</id>
<entry>
<title>Categorized Entry</title>
<id>urn:uuid:entry</id>
<category term="technology"/>
<category term="programming"/>
</entry>
</feed>
""".trimIndent()
val feed = AtomParser.parse(xml, "https://example.com/feed.atom")
assertNotNull(feed)
val entry = feed.items[0]
assertEquals(2, entry.categories?.size)
assertEquals("technology", entry.categories?.get(0))
assertEquals("programming", entry.categories?.get(1))
}
@Test
fun testParseAtomWithEnclosure() {
val xml = """
<?xml version="1.0" encoding="UTF-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
<title>Enclosure Feed</title>
<id>urn:uuid:feed-id</id>
<entry>
<title>Episode</title>
<id>urn:uuid:entry</id>
<link href="https://example.com/ep.mp3" rel="enclosure" type="audio/mpeg" length="12345678"/>
</entry>
</feed>
""".trimIndent()
val feed = AtomParser.parse(xml, "https://example.com/feed.atom")
assertNotNull(feed)
val entry = feed.items[0]
assertNotNull(entry.enclosure)
assertEquals("https://example.com/ep.mp3", entry.enclosure?.url)
assertEquals("audio/mpeg", entry.enclosure?.type)
assertEquals(12345678L, entry.enclosure?.length)
}
@Test
fun testParseAtomWithContent() {
val xml = """
<?xml version="1.0" encoding="UTF-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
<title>Content Feed</title>
<id>urn:uuid:feed-id</id>
<entry>
<title>Entry</title>
<id>urn:uuid:entry</id>
<summary>Short summary</summary>
<content>Full HTML content</content>
</entry>
</feed>
""".trimIndent()
val feed = AtomParser.parse(xml, "https://example.com/feed.atom")
assertNotNull(feed)
val entry = feed.items[0]
assertEquals("Full HTML content", entry.content)
assertEquals("Short summary", entry.description)
}
@Test
fun testParseAtomWithiTunesExtension() {
val xml = """
<?xml version="1.0" encoding="UTF-8"?>
<feed xmlns="http://www.w3.org/2005/Atom" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd">
<title>Podcast</title>
<id>urn:uuid:feed-id</id>
<entry>
<title>Episode</title>
<id>urn:uuid:entry</id>
<itunes:duration>3600</itunes:duration>
<itunes:summary>Episode summary</itunes:summary>
</entry>
</feed>
""".trimIndent()
val feed = AtomParser.parse(xml, "https://example.com/feed.atom")
assertNotNull(feed)
val entry = feed.items[0]
assertEquals("Episode summary", entry.description)
}
@Test
fun testParseAtomWithPublished() {
val xml = """
<?xml version="1.0" encoding="UTF-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
<title>Date Feed</title>
<id>urn:uuid:feed-id</id>
<updated>2024-06-15T12:00:00Z</updated>
<entry>
<title>Entry</title>
<id>urn:uuid:entry</id>
<published>2024-01-01T08:00:00Z</published>
<updated>2024-01-02T10:00:00Z</updated>
</entry>
</feed>
""".trimIndent()
val feed = AtomParser.parse(xml, "https://example.com/feed.atom")
assertNotNull(feed)
val entry = feed.items[0]
assertNotNull(entry.published)
}
@Test
fun testParseAtomWithEmptyFeed() {
val xml = """
<?xml version="1.0" encoding="UTF-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
<title>Empty Feed</title>
<id>urn:uuid:feed-id</id>
</feed>
""".trimIndent()
val feed = AtomParser.parse(xml, "https://example.com/feed.atom")
assertNotNull(feed)
assertEquals("Empty Feed", feed.title)
assertEquals(0, feed.items.size)
}
@Test
fun testParseAtomWithMissingFields() {
val xml = """
<?xml version="1.0" encoding="UTF-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
<entry>
<title>Minimal Entry</title>
</entry>
</feed>
""".trimIndent()
val feed = AtomParser.parse(xml, "https://example.com/feed.atom")
assertNotNull(feed)
assertEquals("Untitled Feed", feed.title)
assertEquals(1, feed.items.size)
assertEquals("Minimal Entry", feed.items[0].title)
assertNull(feed.items[0].link)
}
}


@@ -1,162 +0,0 @@
package com.rssuper.parsing
import org.junit.Assert.assertEquals
import org.junit.Assert.assertNotNull
import org.junit.Assert.fail
import org.junit.Test
import org.junit.runner.RunWith
import org.robolectric.RobolectricTestRunner
import org.robolectric.annotation.Config
@RunWith(RobolectricTestRunner::class)
@Config(sdk = [24])
class FeedParserTest {
@Test
fun testParseRSSFeed() {
val xml = """
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
<channel>
<title>RSS Feed</title>
<link>https://example.com</link>
<item>
<title>Item</title>
<link>https://example.com/item</link>
</item>
</channel>
</rss>
""".trimIndent()
val result = FeedParser.parse(xml, "https://example.com/feed.xml")
assertNotNull(result)
assertEquals(FeedType.RSS, result.feedType)
assertEquals("RSS Feed", result.feed.title)
}
@Test
fun testParseAtomFeed() {
val xml = """
<?xml version="1.0" encoding="UTF-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
<title>Atom Feed</title>
<id>urn:uuid:feed</id>
<entry>
<title>Entry</title>
<id>urn:uuid:entry</id>
</entry>
</feed>
""".trimIndent()
val result = FeedParser.parse(xml, "https://example.com/feed.atom")
assertNotNull(result)
assertEquals(FeedType.Atom, result.feedType)
assertEquals("Atom Feed", result.feed.title)
}
@Test
fun testParseRSSWithNamespaces() {
val xml = """
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd">
<channel>
<title>Namespaced Feed</title>
<atom:link href="https://example.com/feed.xml" rel="self"/>
<itunes:author>Author</itunes:author>
<item>
<title>Item</title>
</item>
</channel>
</rss>
""".trimIndent()
val result = FeedParser.parse(xml, "https://example.com/feed.xml")
assertNotNull(result)
assertEquals(FeedType.RSS, result.feedType)
}
@Test
fun testParseMalformedXml() {
val malformedXml = """
<?xml version="1.0"?>
<rss>
<channel>
<title>Broken
""".trimIndent()
try {
val result = FeedParser.parse(malformedXml, "https://example.com/feed.xml")
assertNotNull(result)
} catch (e: Exception) {
assertNotNull(e)
}
}
@Test
fun testParseInvalidFeedType() {
val invalidXml = """
<?xml version="1.0" encoding="UTF-8"?>
<invalid>
<data>Some data</data>
</invalid>
""".trimIndent()
try {
FeedParser.parse(invalidXml, "https://example.com/feed.xml")
fail("Expected exception for invalid feed type")
} catch (e: FeedParsingError) {
assertEquals(FeedParsingError.UnsupportedFeedType, e)
}
}
@Test
fun testParseEmptyFeed() {
val emptyXml = """
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
<channel>
<title></title>
</channel>
</rss>
""".trimIndent()
val result = FeedParser.parse(emptyXml, "https://example.com/feed.xml")
assertNotNull(result)
assertEquals("Untitled Feed", result.feed.title)
}
@Test
fun testAsyncCallback() {
val xml = """
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
<channel>
<title>Async Feed</title>
<item>
<title>Item</title>
</item>
</channel>
</rss>
""".trimIndent()
FeedParser.parseAsync(xml, "https://example.com/feed.xml") { result ->
assert(result.isSuccess)
val feed = result.getOrNull()
assertNotNull(feed)
assertEquals("Async Feed", feed?.feed?.title)
}
}
@Test
fun testAsyncCallbackError() {
val invalidXml = "not xml"
FeedParser.parseAsync(invalidXml, "https://example.com/feed.xml") { result ->
assert(result.isFailure)
}
}
}


@@ -1,255 +0,0 @@
package com.rssuper.parsing
import com.rssuper.models.Enclosure
import com.rssuper.models.Feed
import org.junit.Assert.assertEquals
import org.junit.Assert.assertNotNull
import org.junit.Assert.assertNull
import org.junit.Test
import org.junit.runner.RunWith
import org.robolectric.RobolectricTestRunner
import org.robolectric.annotation.Config
@RunWith(RobolectricTestRunner::class)
@Config(sdk = [24])
class RSSParserTest {
@Test
fun testParseBasicRSS() {
val xml = """
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
<channel>
<title>Test Feed</title>
<link>https://example.com</link>
<description>A test feed</description>
<language>en-us</language>
<lastBuildDate>Mon, 01 Jan 2024 12:00:00 GMT</lastBuildDate>
<generator>RSS Generator</generator>
<ttl>60</ttl>
<item>
<title>Item 1</title>
<link>https://example.com/item1</link>
<description>Description of item 1</description>
<guid isPermaLink="true">https://example.com/item1</guid>
<pubDate>Mon, 01 Jan 2024 10:00:00 GMT</pubDate>
</item>
<item>
<title>Item 2</title>
<link>https://example.com/item2</link>
<description>Description of item 2</description>
<guid>item-2-guid</guid>
<pubDate>Sun, 31 Dec 2023 10:00:00 GMT</pubDate>
</item>
</channel>
</rss>
""".trimIndent()
val feed = RSSParser.parse(xml, "https://example.com/feed.xml")
assertNotNull(feed)
assertEquals("Test Feed", feed.title)
assertEquals("https://example.com", feed.link)
assertEquals("A test feed", feed.description)
assertEquals("en-us", feed.language)
assertEquals(60, feed.ttl)
assertEquals(2, feed.items.size)
val item1 = feed.items[0]
assertEquals("Item 1", item1.title)
assertEquals("https://example.com/item1", item1.link)
assertEquals("Description of item 1", item1.description)
assertNotNull(item1.published)
}
@Test
fun testParseRSSWithiTunesNamespace() {
val xml = """
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd">
<channel>
<title>Podcast Feed</title>
<link>https://example.com/podcast</link>
<description>My podcast</description>
<itunes:subtitle>Podcast subtitle</itunes:subtitle>
<itunes:author>Author Name</itunes:author>
<item>
<title>Episode 1</title>
<link>https://example.com/episode1</link>
<description>Episode description</description>
<itunes:duration>01:30:00</itunes:duration>
<enclosure url="https://example.com/ep1.mp3" type="audio/mpeg" length="12345678"/>
</item>
</channel>
</rss>
""".trimIndent()
val feed = RSSParser.parse(xml, "https://example.com/feed.xml")
assertNotNull(feed)
assertEquals("Podcast Feed", feed.title)
val item = feed.items[0]
assertEquals("Episode 1", item.title)
assertNotNull(item.enclosure)
assertEquals("https://example.com/ep1.mp3", item.enclosure?.url)
assertEquals("audio/mpeg", item.enclosure?.type)
assertEquals(12345678L, item.enclosure?.length)
}
@Test
fun testParseRSSWithContentNamespace() {
val xml = """
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:content="http://purl.org/rss/1.0/modules/content/">
<channel>
<title>Feed with Content</title>
<item>
<title>Item with Content</title>
<description>Short description</description>
<content:encoded><![CDATA[<p>Full content here</p>]]></content:encoded>
</item>
</channel>
</rss>
""".trimIndent()
val feed = RSSParser.parse(xml, "https://example.com/feed.xml")
assertNotNull(feed)
assertEquals(1, feed.items.size)
assertEquals("Item with Content", feed.items[0].title)
assertEquals("<p>Full content here</p>", feed.items[0].content)
}
@Test
fun testParseRSSWithCategories() {
val xml = """
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
<channel>
<title>Categorized Feed</title>
<item>
<title>Tech Article</title>
<category>Technology</category>
<category>Programming</category>
</item>
</channel>
</rss>
""".trimIndent()
val feed = RSSParser.parse(xml, "https://example.com/feed.xml")
assertNotNull(feed)
val item = feed.items[0]
assertEquals(2, item.categories?.size)
assertEquals("Technology", item.categories?.get(0))
assertEquals("Programming", item.categories?.get(1))
}
@Test
fun testParseRSSWithAuthor() {
val xml = """
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
<channel>
<title>Author Feed</title>
<item>
<title>Article by Author</title>
<author>author@example.com (John Doe)</author>
</item>
</channel>
</rss>
""".trimIndent()
val feed = RSSParser.parse(xml, "https://example.com/feed.xml")
assertNotNull(feed)
val item = feed.items[0]
assertEquals("author@example.com (John Doe)", item.author)
}
@Test
fun testParseRSSWithGuid() {
val xml = """
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
<channel>
<title>Guid Feed</title>
<item>
<title>Item</title>
<guid>custom-guid-12345</guid>
</item>
</channel>
</rss>
""".trimIndent()
val feed = RSSParser.parse(xml, "https://example.com/feed.xml")
assertNotNull(feed)
assertEquals("custom-guid-12345", feed.items[0].guid)
}
@Test
fun testParseRSSWithEmptyChannel() {
val xml = """
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
<channel>
<title>Minimal Feed</title>
</channel>
</rss>
""".trimIndent()
val feed = RSSParser.parse(xml, "https://example.com/feed.xml")
assertNotNull(feed)
assertEquals("Minimal Feed", feed.title)
assertEquals(0, feed.items.size)
}
@Test
fun testParseRSSWithMissingFields() {
val xml = """
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
<channel>
<item>
<title>Only Title</title>
</item>
</channel>
</rss>
""".trimIndent()
val feed = RSSParser.parse(xml, "https://example.com/feed.xml")
assertNotNull(feed)
assertEquals("Untitled Feed", feed.title)
assertEquals(1, feed.items.size)
assertEquals("Only Title", feed.items[0].title)
assertNull(feed.items[0].link)
}
@Test
fun testParseRSSWithCDATA() {
val xml = """
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
<channel>
<title><![CDATA[CDATA Title]]></title>
<description><![CDATA[<p>HTML <strong>content</strong></p>]]></description>
<item>
<title>CDATA Item</title>
<description><![CDATA[Item content]]></description>
</item>
</channel>
</rss>
""".trimIndent()
val feed = RSSParser.parse(xml, "https://example.com/feed.xml")
assertNotNull(feed)
assertEquals("CDATA Title", feed.title)
assertEquals("<p>HTML <strong>content</strong></p>", feed.description)
assertEquals("Item content", feed.items[0].description)
}
}


@@ -1,70 +0,0 @@
package com.rssuper.repository
import com.rssuper.database.daos.FeedItemDao
import com.rssuper.database.entities.FeedItemEntity
import com.rssuper.state.State
import kotlinx.coroutines.flow.MutableStateFlow
import kotlinx.coroutines.test.runTest
import org.junit.Before
import org.junit.Test
import org.mockito.Mockito
import org.mockito.Mockito.`when`
class FeedRepositoryTest {
private lateinit var feedItemDao: FeedItemDao
private lateinit var feedRepository: FeedRepository
@Before
fun setup() {
feedItemDao = Mockito.mock(FeedItemDao::class.java)
feedRepository = FeedRepositoryImpl(feedItemDao)
}
@Test
fun testGetFeedItemsSuccess() = runTest {
val items = listOf(
FeedItemEntity(
id = "1",
subscriptionId = "sub1",
title = "Test Item",
published = java.util.Date()
)
)
val stateFlow = MutableStateFlow<State<List<FeedItemEntity>>>(State.Success(items))
`when`(feedItemDao.getItemsBySubscription("sub1")).thenReturn(stateFlow)
feedRepository.getFeedItems("sub1").collect { state ->
assert(state is State.Success)
assert((state as State.Success).data == items)
}
}
@Test
fun testInsertFeedItemSuccess() = runTest {
val item = FeedItemEntity(
id = "1",
subscriptionId = "sub1",
title = "Test Item",
published = java.util.Date()
)
`when`(feedItemDao.insertItem(item)).thenReturn(1L)
val result = feedRepository.insertFeedItem(item)
assert(result == 1L)
}
@Test(expected = RuntimeException::class)
fun testInsertFeedItemError() = runTest {
`when`(feedItemDao.insertItem(Mockito.any())).thenThrow(RuntimeException("Database error"))
feedRepository.insertFeedItem(FeedItemEntity(
id = "1",
subscriptionId = "sub1",
title = "Test Item",
published = java.util.Date()
))
}
}


@@ -1,108 +0,0 @@
package com.rssuper.repository
import com.rssuper.database.daos.SubscriptionDao
import com.rssuper.database.entities.SubscriptionEntity
import com.rssuper.state.State
import kotlinx.coroutines.flow.MutableStateFlow
import kotlinx.coroutines.test.runTest
import org.junit.Before
import org.junit.Test
import org.mockito.Mockito
import org.mockito.Mockito.`when`
import java.util.Date
class SubscriptionRepositoryTest {
private lateinit var subscriptionDao: SubscriptionDao
private lateinit var subscriptionRepository: SubscriptionRepository
@Before
fun setup() {
subscriptionDao = Mockito.mock(SubscriptionDao::class.java)
subscriptionRepository = SubscriptionRepositoryImpl(subscriptionDao)
}
@Test
fun testGetAllSubscriptionsSuccess() = runTest {
val subscriptions = listOf(
SubscriptionEntity(
id = "1",
url = "https://example.com/feed.xml",
title = "Test Feed",
createdAt = Date(),
updatedAt = Date()
)
)
val stateFlow = MutableStateFlow<State<List<SubscriptionEntity>>>(State.Success(subscriptions))
`when`(subscriptionDao.getAllSubscriptions()).thenReturn(stateFlow)
subscriptionRepository.getAllSubscriptions().collect { state ->
assert(state is State.Success)
assert((state as State.Success).data == subscriptions)
}
}
@Test
fun testGetEnabledSubscriptionsSuccess() = runTest {
val subscriptions = listOf(
SubscriptionEntity(
id = "1",
url = "https://example.com/feed.xml",
title = "Test Feed",
enabled = true,
createdAt = Date(),
updatedAt = Date()
)
)
val stateFlow = MutableStateFlow<State<List<SubscriptionEntity>>>(State.Success(subscriptions))
`when`(subscriptionDao.getEnabledSubscriptions()).thenReturn(stateFlow)
subscriptionRepository.getEnabledSubscriptions().collect { state ->
assert(state is State.Success)
assert((state as State.Success).data == subscriptions)
}
}
@Test
fun testInsertSubscriptionSuccess() = runTest {
val subscription = SubscriptionEntity(
id = "1",
url = "https://example.com/feed.xml",
title = "Test Feed",
createdAt = Date(),
updatedAt = Date()
)
`when`(subscriptionDao.insertSubscription(subscription)).thenReturn(1L)
val result = subscriptionRepository.insertSubscription(subscription)
assert(result == 1L)
}
@Test
fun testUpdateSubscriptionSuccess() = runTest {
val subscription = SubscriptionEntity(
id = "1",
url = "https://example.com/feed.xml",
title = "Test Feed",
enabled = true,
createdAt = Date(),
updatedAt = Date()
)
`when`(subscriptionDao.updateSubscription(subscription)).thenReturn(1)
val result = subscriptionRepository.updateSubscription(subscription)
assert(result == 1)
}
@Test
fun testSetEnabledSuccess() = runTest {
`when`(subscriptionDao.setEnabled("1", true)).thenReturn(1)
val result = subscriptionRepository.setEnabled("1", true)
assert(result == 1)
}
}


@@ -1,106 +0,0 @@
package com.rssuper.services
import org.junit.Assert.assertTrue
import org.junit.Test
class FeedFetcherIntegrationTest {
@Test
fun testFetchRealFeed() {
val feedFetcher = FeedFetcher(timeoutMs = 15000)
val result = feedFetcher.fetch("https://example.com/feed.xml")
assertTrue(result.isSuccess() || result.isFailure())
}
@Test
fun testFetchAndParseRealFeed() {
val feedFetcher = FeedFetcher(timeoutMs = 15000)
val result = feedFetcher.fetchAndParse("https://example.com/feed.xml")
assertTrue(result.isSuccess() || result.isFailure())
}
@Test
fun testFetchWithHTTPAuthCredentials() {
val feedFetcher = FeedFetcher(timeoutMs = 15000)
val auth = HTTPAuthCredentials("testuser", "testpass")
val credentials = auth.toCredentials()
assertTrue(credentials.startsWith("Basic "))
}
@Test
fun testFetchWithCacheControl() {
val feedFetcher = FeedFetcher(timeoutMs = 15000)
val result = feedFetcher.fetch("https://example.com/feed.xml")
assertTrue(result.isSuccess() || result.isFailure())
}
@Test
fun testFetchPerformance() {
val feedFetcher = FeedFetcher(timeoutMs = 15000)
val startTime = System.currentTimeMillis()
val result = feedFetcher.fetch("https://example.com/feed.xml")
val duration = System.currentTimeMillis() - startTime
assertTrue(duration < 20000 || result.isFailure())
}
@Test
fun testFetchWithIfNoneMatch() {
val feedFetcher = FeedFetcher(timeoutMs = 15000)
val etag = "test-etag-value"
val result = feedFetcher.fetch("https://example.com/feed.xml", ifNoneMatch = etag)
assertTrue(result.isSuccess() || result.isFailure())
}
@Test
fun testFetchWithIfModifiedSince() {
val feedFetcher = FeedFetcher(timeoutMs = 15000)
val lastModified = "Mon, 01 Jan 2024 00:00:00 GMT"
val result = feedFetcher.fetch("https://example.com/feed.xml", ifModifiedSince = lastModified)
assertTrue(result.isSuccess() || result.isFailure())
}
@Test
fun testFetchMultipleFeeds() {
val feedFetcher = FeedFetcher(timeoutMs = 15000)
val urls = listOf(
"https://example.com/feed1.xml",
"https://example.com/feed2.xml"
)
for (url in urls) {
val result = feedFetcher.fetch(url)
assertTrue(result.isSuccess() || result.isFailure())
}
}
@Test
fun testFetchWithDifferentTimeouts() {
val shortTimeoutFetcher = FeedFetcher(timeoutMs = 1000)
val longTimeoutFetcher = FeedFetcher(timeoutMs = 30000)
val shortClientField = FeedFetcher::class.java.getDeclaredField("client")
shortClientField.isAccessible = true
val shortClient = shortClientField.get(shortTimeoutFetcher) as okhttp3.OkHttpClient
val longClientField = FeedFetcher::class.java.getDeclaredField("client")
longClientField.isAccessible = true
val longClient = longClientField.get(longTimeoutFetcher) as okhttp3.OkHttpClient
assertTrue(shortClient.connectTimeoutMillis < longClient.connectTimeoutMillis)
}
}


@@ -1,57 +0,0 @@
package com.rssuper.services
import org.junit.Assert.assertEquals
import org.junit.Assert.assertNotNull
import org.junit.Assert.assertTrue
import org.junit.Test
class FeedFetcherTest {
@Test
fun testOkHttpConfiguration() {
val feedFetcher = FeedFetcher(timeoutMs = 5000)
val clientField = FeedFetcher::class.java.getDeclaredField("client")
clientField.isAccessible = true
val okHttpClient = clientField.get(feedFetcher) as okhttp3.OkHttpClient
assertEquals(5000, okHttpClient.connectTimeoutMillis)
assertEquals(5000, okHttpClient.readTimeoutMillis)
assertEquals(5000, okHttpClient.writeTimeoutMillis)
assertNotNull(okHttpClient.eventListenerFactory)
}
@Test
fun testFetchWithHTTPAuth() {
val auth = HTTPAuthCredentials("user", "pass")
val credentials = auth.toCredentials()
assertNotNull(credentials)
assertTrue(credentials.startsWith("Basic "))
}
@Test
fun testFetchWithETag() {
val feedFetcher = FeedFetcher(timeoutMs = 15000)
val etag = "test-etag-123"
val result = feedFetcher.fetch("https://example.com/feed.xml", ifNoneMatch = etag)
assertTrue(result.isSuccess() || result.isFailure())
}
@Test
fun testFetchWithLastModified() {
val feedFetcher = FeedFetcher(timeoutMs = 15000)
val lastModified = "Mon, 01 Jan 2024 00:00:00 GMT"
val result = feedFetcher.fetch("https://example.com/feed.xml", ifModifiedSince = lastModified)
assertTrue(result.isSuccess() || result.isFailure())
}
@Test
fun testFetchRetrySuccess() {
val feedFetcher = FeedFetcher(timeoutMs = 15000, maxRetries = 3)
val result = feedFetcher.fetch("https://example.com/feed.xml")
assertTrue(result.isSuccess() || result.isFailure())
}
}


@@ -1,79 +0,0 @@
package com.rssuper.services
import org.junit.Assert.assertEquals
import org.junit.Assert.assertNotNull
import org.junit.Assert.assertTrue
import org.junit.Test
class FetchResultTest {
@Test
fun testFetchResultCreation() {
val result = FetchResult(
feedXml = "<rss>test</rss>",
url = "https://example.com/feed.xml",
cacheControl = null,
isCached = false
)
assertEquals("<rss>test</rss>", result.feedXml)
assertEquals("https://example.com/feed.xml", result.url)
assertEquals(false, result.isCached)
assertEquals(null, result.cacheControl)
}
@Test
fun testFetchResultWithETag() {
val result = FetchResult(
feedXml = "<rss>test</rss>",
url = "https://example.com/feed.xml",
cacheControl = null,
isCached = false,
etag = "test-etag-123"
)
assertEquals("test-etag-123", result.etag)
}
@Test
fun testFetchResultWithLastModified() {
val result = FetchResult(
feedXml = "<rss>test</rss>",
url = "https://example.com/feed.xml",
cacheControl = null,
isCached = false,
lastModified = "Mon, 01 Jan 2024 00:00:00 GMT"
)
assertEquals("Mon, 01 Jan 2024 00:00:00 GMT", result.lastModified)
}
@Test
fun testFetchResultIsCached() {
val result = FetchResult(
feedXml = "<rss>test</rss>",
url = "https://example.com/feed.xml",
cacheControl = null,
isCached = true
)
assertTrue(result.isCached)
}
@Test
fun testFetchResultWithCacheControl() {
val cacheControl = okhttp3.CacheControl.Builder()
.noCache()
.build()
val result = FetchResult(
feedXml = "<rss>test</rss>",
url = "https://example.com/feed.xml",
cacheControl = cacheControl,
isCached = false
)
assertNotNull(result.cacheControl)
assertTrue(result.cacheControl!!.noCache)
}
}


@@ -1,53 +0,0 @@
package com.rssuper.services
import org.junit.Assert.assertEquals
import org.junit.Assert.assertNotNull
import org.junit.Assert.assertTrue
import org.junit.Test
class HTTPAuthCredentialsTest {
@Test
fun testBasicAuthCredentials() {
val auth = HTTPAuthCredentials("username", "password")
val credentials = auth.toCredentials()
assertNotNull(credentials)
assertTrue(credentials.startsWith("Basic "))
}
@Test
fun testBasicAuthCredentialsWithSpecialChars() {
val auth = HTTPAuthCredentials("user@domain", "pass!@#")
val credentials = auth.toCredentials()
assertNotNull(credentials)
assertTrue(credentials.startsWith("Basic "))
}
@Test
fun testUsernameAndPassword() {
val auth = HTTPAuthCredentials("testuser", "testpass")
assertEquals("testuser", auth.username)
assertEquals("testpass", auth.password)
}
@Test
fun testEmptyUsername() {
val auth = HTTPAuthCredentials("", "password")
val credentials = auth.toCredentials()
assertNotNull(credentials)
assertTrue(credentials.startsWith("Basic "))
}
@Test
fun testEmptyPassword() {
val auth = HTTPAuthCredentials("username", "")
val credentials = auth.toCredentials()
assertNotNull(credentials)
assertTrue(credentials.startsWith("Basic "))
}
}
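These checks pin down the RFC 7617 rule the class implements: the credential is the literal `Basic ` prefix plus the Base64 of `username:password`. A standalone Python sketch of the same expectation (the function name is illustrative, not the app's API):

```python
import base64

def to_credentials(username: str, password: str) -> str:
    """HTTP Basic auth header value: 'Basic ' + base64(username:password)."""
    token = base64.b64encode(f"{username}:{password}".encode("utf-8")).decode("ascii")
    return f"Basic {token}"

header = to_credentials("username", "password")
print(header)  # → Basic dXNlcm5hbWU6cGFzc3dvcmQ=
```

Note that empty usernames or passwords still produce a syntactically valid header, which is why the empty-credential tests above only assert the prefix.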


@@ -1,66 +0,0 @@
package com.rssuper.state
import org.junit.Assert.assertEquals
import org.junit.Assert.assertFalse
import org.junit.Assert.assertTrue
import org.junit.Test
class StateTest {
@Test
fun testIdleState() {
val state: State<String> = State.Idle
assertTrue(state is State.Idle)
}
@Test
fun testLoadingState() {
val state: State<String> = State.Loading
assertTrue(state is State.Loading)
}
@Test
fun testSuccessState() {
val data = "test data"
val state: State<String> = State.Success(data)
assertTrue(state is State.Success)
assertEquals(data, (state as State.Success).data)
}
@Test
fun testErrorState() {
val message = "test error"
val state: State<String> = State.Error(message)
assertTrue(state is State.Error)
assertEquals(message, (state as State.Error).message)
assertEquals(null, (state as State.Error).cause)
}
@Test
fun testErrorStateWithCause() {
val message = "test error"
val cause = RuntimeException("cause")
val state: State<String> = State.Error(message, cause)
assertTrue(state is State.Error)
assertEquals(message, (state as State.Error).message)
assertEquals(cause, (state as State.Error).cause)
}
@Test
fun testErrorType() {
assertTrue(ErrorType.NETWORK != ErrorType.DATABASE)
assertTrue(ErrorType.PARSING != ErrorType.AUTH)
}
@Test
fun testErrorDetails() {
val details = ErrorDetails(ErrorType.NETWORK, "Network error", true)
assertEquals(ErrorType.NETWORK, details.type)
assertEquals("Network error", details.message)
assertTrue(details.retryable)
}
}
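`State` is a sealed class with four cases — Idle, Loading, Success(data), Error(message, cause) — so every consumer must handle all of them. A rough Python analogue using dataclasses (the names mirror the Kotlin cases; the common base class is an assumption standing in for `sealed`):

```python
from dataclasses import dataclass
from typing import Generic, Optional, TypeVar

T = TypeVar("T")

class State:
    """Base marker type, standing in for Kotlin's sealed class."""

@dataclass
class Idle(State): ...

@dataclass
class Loading(State): ...

@dataclass
class Success(State, Generic[T]):
    data: T

@dataclass
class Error(State):
    message: str
    cause: Optional[Exception] = None

# Pattern mirrored by the tests above: check the case, then read its payload.
state: State = Success([1, 2, 3])
assert isinstance(state, Success) and state.data == [1, 2, 3]
assert Error("boom").cause is None  # cause defaults to null/None
```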


@@ -1,73 +0,0 @@
package com.rssuper.viewmodel
import com.rssuper.repository.FeedRepository
import com.rssuper.state.State
import kotlinx.coroutines.flow.MutableStateFlow
import kotlinx.coroutines.test.runTest
import org.junit.Before
import org.junit.Test
import org.mockito.Mockito
import org.mockito.Mockito.`when`
class FeedViewModelTest {
private lateinit var feedRepository: FeedRepository
private lateinit var viewModel: FeedViewModel
@Before
fun setup() {
feedRepository = Mockito.mock(FeedRepository::class.java)
viewModel = FeedViewModel(feedRepository)
}
@Test
fun testInitialState() = runTest {
// feedState is a StateFlow: collect {} never completes, which would hang
// runTest. Read the current value instead.
assert(viewModel.feedState.value is State.Idle)
}
@Test
fun testLoadFeedItems() = runTest {
val items = listOf(
com.rssuper.database.entities.FeedItemEntity(
id = "1",
subscriptionId = "sub1",
title = "Test Item",
published = java.util.Date()
)
)
val stateFlow = MutableStateFlow<State<List<com.rssuper.database.entities.FeedItemEntity>>>(State.Success(items))
`when`(feedRepository.getFeedItems("sub1")).thenReturn(stateFlow)
viewModel.loadFeedItems("sub1")
// Read the StateFlow's current value; collecting it would suspend forever.
val receivedState = viewModel.feedState.value
assert(receivedState is State.Success)
assert((receivedState as State.Success).data == items)
}
@Test
fun testMarkAsRead() = runTest {
`when`(feedRepository.markAsRead("1", true)).thenReturn(1)
`when`(feedRepository.getUnreadCount("sub1")).thenReturn(5)
viewModel.markAsRead("1", true)
// Read the StateFlow's current value; collecting it would suspend forever.
val unreadCountState = viewModel.unreadCount.value
assert(unreadCountState is State.Success)
assert((unreadCountState as State.Success).data == 5)
}
}


@@ -1,100 +0,0 @@
package com.rssuper.viewmodel
import com.rssuper.repository.SubscriptionRepository
import com.rssuper.state.State
import kotlinx.coroutines.flow.MutableStateFlow
import kotlinx.coroutines.test.runTest
import org.junit.Before
import org.junit.Test
import org.mockito.Mockito
import org.mockito.Mockito.`when`
import java.util.Date
class SubscriptionViewModelTest {
private lateinit var subscriptionRepository: SubscriptionRepository
private lateinit var viewModel: SubscriptionViewModel
@Before
fun setup() {
subscriptionRepository = Mockito.mock(SubscriptionRepository::class.java)
viewModel = SubscriptionViewModel(subscriptionRepository)
}
@Test
fun testInitialState() = runTest {
// subscriptionsState is a StateFlow: collect {} never completes, which
// would hang runTest. Read the current value instead.
assert(viewModel.subscriptionsState.value is State.Idle)
}
@Test
fun testLoadAllSubscriptions() = runTest {
val subscriptions = listOf(
com.rssuper.database.entities.SubscriptionEntity(
id = "1",
url = "https://example.com/feed.xml",
title = "Test Feed",
createdAt = Date(),
updatedAt = Date()
)
)
val stateFlow = MutableStateFlow<State<List<com.rssuper.database.entities.SubscriptionEntity>>>(State.Success(subscriptions))
`when`(subscriptionRepository.getAllSubscriptions()).thenReturn(stateFlow)
viewModel.loadAllSubscriptions()
// Read the StateFlow's current value; collecting it would suspend forever.
val receivedState = viewModel.subscriptionsState.value
assert(receivedState is State.Success)
assert((receivedState as State.Success).data == subscriptions)
}
@Test
fun testSetEnabled() = runTest {
val subscriptions = listOf(
com.rssuper.database.entities.SubscriptionEntity(
id = "1",
url = "https://example.com/feed.xml",
title = "Test Feed",
enabled = true,
createdAt = Date(),
updatedAt = Date()
)
)
val stateFlow = MutableStateFlow<State<List<com.rssuper.database.entities.SubscriptionEntity>>>(State.Success(subscriptions))
`when`(subscriptionRepository.setEnabled("1", true)).thenReturn(1)
`when`(subscriptionRepository.getEnabledSubscriptions()).thenReturn(stateFlow)
viewModel.setEnabled("1", true)
// Read the StateFlow's current value; collecting it would suspend forever.
val receivedState = viewModel.enabledSubscriptionsState.value
assert(receivedState is State.Success)
assert((receivedState as State.Success).data == subscriptions)
}
@Test
fun testSetError() = runTest {
`when`(subscriptionRepository.setError("1", "Test error")).thenReturn(1)
viewModel.setError("1", "Test error")
// A StateFlow always holds a value, so some state has been "emitted".
assert(viewModel.subscriptionsState.value is State<*>)
}
}

Submodule native-route/ios/RSSuper deleted from 7916c92d76


@@ -1 +0,0 @@
build


@@ -1,74 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<gsettings schema="org.rssuper.notification.preferences">
<prefix>rssuper</prefix>
<binding>
<property name="newArticles" type="boolean"/>
</binding>
<binding>
<property name="episodeReleases" type="boolean"/>
</binding>
<binding>
<property name="customAlerts" type="boolean"/>
</binding>
<binding>
<property name="badgeCount" type="boolean"/>
</binding>
<binding>
<property name="sound" type="boolean"/>
</binding>
<binding>
<property name="vibration" type="boolean"/>
</binding>
<binding>
<property name="preferences" type="json"/>
</binding>
<keyvalue>
<key name="newArticles">New Article Notifications</key>
<default>true</default>
<description>Enable notifications for new articles</description>
</keyvalue>
<keyvalue>
<key name="episodeReleases">Episode Release Notifications</key>
<default>true</default>
<description>Enable notifications for episode releases</description>
</keyvalue>
<keyvalue>
<key name="customAlerts">Custom Alert Notifications</key>
<default>true</default>
<description>Enable notifications for custom alerts</description>
</keyvalue>
<keyvalue>
<key name="badgeCount">Badge Count</key>
<default>true</default>
<description>Show badge count in app header</description>
</keyvalue>
<keyvalue>
<key name="sound">Sound</key>
<default>true</default>
<description>Play sound on notification</description>
</keyvalue>
<keyvalue>
<key name="vibration">Vibration</key>
<default>true</default>
<description>Vibrate device on notification</description>
</keyvalue>
<keyvalue>
<key name="preferences">All Preferences</key>
<default>{
"newArticles": true,
"episodeReleases": true,
"customAlerts": true,
"badgeCount": true,
"sound": true,
"vibration": true
}</default>
<description>All notification preferences as JSON</description>
</keyvalue>
</gsettings>


@@ -1,119 +0,0 @@
project('rssuper-linux', 'vala', 'c',
version: '0.1.0',
default_options: [
'c_std=c11',
'warning_level=3',
'werror=false',
]
)
vala = find_program('valac')
# Fail early if valac is missing or broken (result intentionally unused)
run_command(vala, '--version', check: true)
# Dependencies
glib_dep = dependency('glib-2.0', version: '>= 2.58')
gio_dep = dependency('gio-2.0', version: '>= 2.58')
json_dep = dependency('json-glib-1.0', version: '>= 1.4')
sqlite_dep = dependency('sqlite3', version: '>= 3.0')
gobject_dep = dependency('gobject-2.0', version: '>= 2.58')
xml_dep = dependency('libxml-2.0', version: '>= 2.0')
soup_dep = dependency('libsoup-3.0', version: '>= 3.0')
# Source files
models = files(
'src/models/feed-item.vala',
'src/models/feed.vala',
'src/models/feed-subscription.vala',
'src/models/search-result.vala',
'src/models/search-filters.vala',
'src/models/notification-preferences.vala',
'src/models/reading-preferences.vala',
)
# Database files
database = files(
'src/database/db-error.vala',
'src/database/database.vala',
'src/database/subscription-store.vala',
'src/database/feed-item-store.vala',
'src/database/search-history-store.vala',
)
# Parser files
parser = files(
'src/parser/feed-type.vala',
'src/parser/parse-result.vala',
'src/parser/rss-parser.vala',
'src/parser/atom-parser.vala',
'src/parser/feed-parser.vala',
)
# Network files
network = files(
'src/network/network-error.vala',
'src/network/http-auth-credentials.vala',
'src/network/fetch-result.vala',
'src/network/feed-fetcher.vala',
)
# Main library
models_lib = library('rssuper-models', models,
dependencies: [glib_dep, gio_dep, json_dep],
install: false
)
# Database library
database_lib = library('rssuper-database', database,
dependencies: [glib_dep, gio_dep, json_dep, sqlite_dep, gobject_dep],
link_with: [models_lib],
install: false,
vala_args: ['--vapidir', 'src/database', '--pkg', 'sqlite3']
)
# Parser library
parser_lib = library('rssuper-parser', parser,
dependencies: [glib_dep, gio_dep, json_dep, xml_dep],
link_with: [models_lib],
install: false,
vala_args: ['--vapidir', 'src/parser', '--pkg', 'libxml-2.0']
)
# Network library
network_lib = library('rssuper-network', network,
dependencies: [glib_dep, gio_dep, json_dep, soup_dep],
link_with: [models_lib],
install: false,
vala_args: ['--vapidir', 'src/network', '--pkg', 'libsoup-3.0']
)
# Test executable
test_exe = executable('database-tests',
'src/tests/database-tests.vala',
dependencies: [glib_dep, gio_dep, json_dep, sqlite_dep, gobject_dep, xml_dep],
link_with: [models_lib, database_lib, parser_lib],
vala_args: ['--vapidir', '.', '--pkg', 'sqlite3', '--pkg', 'libxml-2.0'],
install: false
)
# Parser test executable
parser_test_exe = executable('parser-tests',
'src/tests/parser-tests.vala',
dependencies: [glib_dep, gio_dep, json_dep, xml_dep],
link_with: [models_lib, parser_lib],
vala_args: ['--vapidir', '.', '--pkg', 'libxml-2.0'],
install: false
)
# Feed fetcher test executable
fetcher_test_exe = executable('feed-fetcher-tests',
'src/tests/feed-fetcher-tests.vala',
dependencies: [glib_dep, gio_dep, json_dep, xml_dep, soup_dep],
link_with: [models_lib, parser_lib, network_lib],
vala_args: ['--vapidir', '.', '--pkg', 'libxml-2.0', '--pkg', 'libsoup-3.0'],
install: false
)
# Test definitions
test('database tests', test_exe)
test('parser tests', parser_test_exe)
test('feed fetcher tests', fetcher_test_exe)


@@ -1,69 +0,0 @@
/*
* RSSuper Database vapi - exports SQLite bindings for use by dependent modules
*/
[CCode (cheader_filename = "sqlite3.h")]
namespace SQLite {
[CCode (cname = "sqlite3", free_function = "sqlite3_close")]
public class DB {
[CCode (cname = "sqlite3_open")]
public static int open(string filename, out DB db);
[CCode (cname = "sqlite3_close")]
public int close();
[CCode (cname = "sqlite3_exec")]
public int exec(string sql, DBCallback? callback = null, void* arg = null, [CCode (array_length = false)] out string? errmsg = null);
[CCode (cname = "sqlite3_errmsg")]
public unowned string errmsg();
[CCode (cname = "sqlite3_prepare_v2")]
public int prepare_v2(string zSql, int nByte, out Stmt stmt, void* pzTail = null);
}
[CCode (cname = "sqlite3_stmt", free_function = "sqlite3_finalize")]
public class Stmt {
[CCode (cname = "sqlite3_step")]
public int step();
[CCode (cname = "sqlite3_column_count")]
public int column_count();
[CCode (cname = "sqlite3_column_text")]
public unowned string column_text(int i);
[CCode (cname = "sqlite3_column_int")]
public int column_int(int i);
[CCode (cname = "sqlite3_column_double")]
public double column_double(int i);
[CCode (cname = "sqlite3_bind_text")]
public int bind_text(int i, string z, int n, void* x);
[CCode (cname = "sqlite3_bind_int")]
public int bind_int(int i, int val);
[CCode (cname = "sqlite3_bind_double")]
public int bind_double(int i, double val);
[CCode (cname = "sqlite3_bind_null")]
public int bind_null(int i);
[CCode (cname = "sqlite3_finalize")]
public int finalize();
}
[CCode (cname = "SQLITE_OK")]
public const int SQLITE_OK;
[CCode (cname = "SQLITE_ROW")]
public const int SQLITE_ROW;
[CCode (cname = "SQLITE_DONE")]
public const int SQLITE_DONE;
[CCode (cname = "SQLITE_ERROR")]
public const int SQLITE_ERROR;
[CCode (has_target = false)]
public delegate int DBCallback(void* arg, int argc, string[] argv, string[] col_names);
}


@@ -1,200 +0,0 @@
/*
* Database.vala
*
* Core database connection and migration management for RSSuper Linux.
* Uses SQLite with FTS5 for full-text search capabilities.
*/
/**
* Database - Manages SQLite database connection and migrations
*/
public class RSSuper.Database : Object {
private Sqlite.Database db;
private string db_path;
/**
* Current database schema version
*/
public const int CURRENT_VERSION = 1;
/**
* Signal emitted when database is ready
*/
public signal void ready();
/**
* Signal emitted on error
*/
public signal void error(string message);
/**
* Create a new database connection
*
* @param db_path Path to the SQLite database file
*/
public Database(string db_path) throws Error {
this.db_path = db_path;
this.open();
this.migrate();
}
/**
* Open database connection
*/
private void open() throws Error {
var file = File.new_for_path(db_path);
var parent = file.get_parent();
if (parent != null && !parent.query_exists()) {
try {
parent.make_directory_with_parents();
} catch (Error e) {
throw new DBError.FAILED("Failed to create database directory: %s", e.message);
}
}
int result = Sqlite.Database.open(db_path, out db);
if (result != Sqlite.OK) {
throw new DBError.FAILED("Failed to open database: %s".printf(db.errmsg()));
}
execute("PRAGMA foreign_keys = ON;");
execute("PRAGMA journal_mode = WAL;");
debug("Database opened: %s", db_path);
}
/**
* Run database migrations
*/
private void migrate() throws Error {
// Create schema_migrations table if not exists
execute("CREATE TABLE IF NOT EXISTS schema_migrations (version INTEGER PRIMARY KEY, applied_at TEXT NOT NULL DEFAULT (datetime('now')));");
// Create feed_subscriptions table
execute("CREATE TABLE IF NOT EXISTS feed_subscriptions (id TEXT PRIMARY KEY, url TEXT NOT NULL UNIQUE, title TEXT NOT NULL, category TEXT, enabled INTEGER NOT NULL DEFAULT 1, fetch_interval INTEGER NOT NULL DEFAULT 60, created_at TEXT NOT NULL, updated_at TEXT NOT NULL, last_fetched_at TEXT, next_fetch_at TEXT, error TEXT, http_auth_username TEXT, http_auth_password TEXT);");
// Create feed_items table
execute("CREATE TABLE IF NOT EXISTS feed_items (id TEXT PRIMARY KEY, subscription_id TEXT NOT NULL, title TEXT NOT NULL, link TEXT, description TEXT, content TEXT, author TEXT, published TEXT, updated TEXT, categories TEXT, enclosure_url TEXT, enclosure_type TEXT, enclosure_length TEXT, guid TEXT, is_read INTEGER NOT NULL DEFAULT 0, is_starred INTEGER NOT NULL DEFAULT 0, created_at TEXT NOT NULL DEFAULT (datetime('now')), FOREIGN KEY (subscription_id) REFERENCES feed_subscriptions(id) ON DELETE CASCADE);");
// Create indexes for feed_items
execute("CREATE INDEX IF NOT EXISTS idx_feed_items_subscription ON feed_items(subscription_id);");
execute("CREATE INDEX IF NOT EXISTS idx_feed_items_published ON feed_items(published DESC);");
execute("CREATE INDEX IF NOT EXISTS idx_feed_items_read ON feed_items(is_read);");
execute("CREATE INDEX IF NOT EXISTS idx_feed_items_starred ON feed_items(is_starred);");
// Create search_history table
execute("CREATE TABLE IF NOT EXISTS search_history (id INTEGER PRIMARY KEY AUTOINCREMENT, query TEXT NOT NULL, filters_json TEXT, sort_option TEXT NOT NULL DEFAULT 'relevance', page INTEGER NOT NULL DEFAULT 1, page_size INTEGER NOT NULL DEFAULT 20, result_count INTEGER, created_at TEXT NOT NULL DEFAULT (datetime('now')));");
execute("CREATE INDEX IF NOT EXISTS idx_search_history_created ON search_history(created_at DESC);");
// Create FTS5 virtual table
execute("CREATE VIRTUAL TABLE IF NOT EXISTS feed_items_fts USING fts5(title, description, content, author, content='feed_items', content_rowid='rowid');");
// Create triggers for FTS sync
execute("CREATE TRIGGER IF NOT EXISTS feed_items_ai AFTER INSERT ON feed_items BEGIN INSERT INTO feed_items_fts(rowid, title, description, content, author) VALUES (new.rowid, new.title, new.description, new.content, new.author); END;");
execute("CREATE TRIGGER IF NOT EXISTS feed_items_ad AFTER DELETE ON feed_items BEGIN INSERT INTO feed_items_fts(feed_items_fts, rowid, title, description, content, author) VALUES('delete', old.rowid, old.title, old.description, old.content, old.author); END;");
execute("CREATE TRIGGER IF NOT EXISTS feed_items_au AFTER UPDATE ON feed_items BEGIN INSERT INTO feed_items_fts(feed_items_fts, rowid, title, description, content, author) VALUES('delete', old.rowid, old.title, old.description, old.content, old.author); INSERT INTO feed_items_fts(rowid, title, description, content, author) VALUES (new.rowid, new.title, new.description, new.content, new.author); END;");
// Record migration
execute("INSERT OR REPLACE INTO schema_migrations (version, applied_at) VALUES (" + CURRENT_VERSION.to_string() + ", datetime('now'));");
debug("Database migrated to version %d", CURRENT_VERSION);
}
/**
* Get current migration version
*/
private int get_current_version() throws Error {
try {
Sqlite.Statement stmt;
int result = db.prepare_v2("SELECT COALESCE(MAX(version), 0) FROM schema_migrations;", -1, out stmt, null);
if (result != Sqlite.OK) {
throw new DBError.FAILED("Failed to prepare statement: %s".printf(db.errmsg()));
}
int version = 0;
if (stmt.step() == Sqlite.ROW) {
version = stmt.column_int(0);
}
return version;
} catch (Error e) {
throw new DBError.FAILED("Failed to get migration version: %s".printf(e.message));
}
}
/**
* Execute a SQL statement
*/
public void execute(string sql) throws Error {
string? errmsg;
int result = db.exec(sql, null, out errmsg);
if (result != Sqlite.OK) {
throw new DBError.FAILED("SQL execution failed: %s\nSQL: %s".printf(errmsg, sql));
}
}
/**
* Prepare a SQL statement
*/
public Sqlite.Statement prepare(string sql) throws Error {
Sqlite.Statement stmt;
int result = db.prepare_v2(sql, -1, out stmt, null);
if (result != Sqlite.OK) {
throw new DBError.FAILED("Failed to prepare statement: %s\nSQL: %s".printf(db.errmsg(), sql));
}
return stmt;
}
/**
* Get the database connection handle
*/
public unowned Sqlite.Database get_handle() {
return db;
}
/**
* Close database connection
*/
public void close() {
if (db != null) {
db = null;
debug("Database closed: %s", db_path);
}
}
/**
* Begin a transaction
*/
public void begin_transaction() throws Error {
execute("BEGIN TRANSACTION;");
}
/**
* Commit a transaction
*/
public void commit() throws Error {
execute("COMMIT;");
}
/**
* Rollback a transaction
*/
public void rollback() throws Error {
execute("ROLLBACK;");
}
/* Helper to convert GLib.List to array */
private T[] toArray<T>(GLib.List<T> list) {
T[] arr = {};
for (unowned var node = list; node != null; node = node.next) {
arr += node.data;
}
return arr;
}
}
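open() above enables `PRAGMA foreign_keys = ON` before migrate() creates `feed_items` with `ON DELETE CASCADE`; SQLite ships with foreign-key enforcement off, so the pragma is what makes the cascade real. A minimal sketch of that behavior with Python's `sqlite3` (schema trimmed to the two relevant columns):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON;")  # enforcement is off by default
conn.execute("CREATE TABLE feed_subscriptions (id TEXT PRIMARY KEY);")
conn.execute("""
    CREATE TABLE feed_items (
        id TEXT PRIMARY KEY,
        subscription_id TEXT NOT NULL,
        FOREIGN KEY (subscription_id) REFERENCES feed_subscriptions(id)
            ON DELETE CASCADE
    );
""")
conn.execute("INSERT INTO feed_subscriptions VALUES ('sub1');")
conn.execute("INSERT INTO feed_items VALUES ('item1', 'sub1');")

# Deleting the parent subscription cascades to its items.
conn.execute("DELETE FROM feed_subscriptions WHERE id = 'sub1';")
count = conn.execute("SELECT COUNT(*) FROM feed_items;").fetchone()[0]
print(count)  # → 0
```

Without the pragma, the same DELETE would leave `item1` orphaned.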


@@ -1,24 +0,0 @@
/*
* DBError.vala
*
* Database error domain definition.
*/
namespace RSSuper {
/**
* DBError - Database error domain
*/
public errordomain DBError {
FAILED, /** Generic database operation failed */
NOT_FOUND, /** Record not found */
DUPLICATE, /** Duplicate key or record */
CORRUPTED, /** Database is corrupted */
TIMEOUT, /** Operation timed out */
INVALID_STATE, /** Invalid database state */
MIGRATION_FAILED, /** Migration failed */
CONSTRAINT_FAILED, /** Constraint violation */
FOREIGN_KEY_FAILED, /** Foreign key constraint failed */
UNIQUE_FAILED, /** Unique constraint failed */
CHECK_FAILED, /** Check constraint failed */
}
}


@@ -1,416 +0,0 @@
/*
* FeedItemStore.vala
*
* CRUD operations for feed items with FTS search support.
*/
/**
* FeedItemStore - Manages feed item persistence
*/
public class RSSuper.FeedItemStore : Object {
private Database db;
/**
* Signal emitted when an item is added
*/
public signal void item_added(FeedItem item);
/**
* Signal emitted when an item is updated
*/
public signal void item_updated(FeedItem item);
/**
* Signal emitted when an item is deleted
*/
public signal void item_deleted(string id);
/**
* Create a new feed item store
*/
public FeedItemStore(Database db) {
this.db = db;
}
/**
* Add a new feed item
*/
public FeedItem add(FeedItem item) throws Error {
var stmt = db.prepare(
"INSERT INTO feed_items (id, subscription_id, title, link, description, content, " +
"author, published, updated, categories, enclosure_url, enclosure_type, " +
"enclosure_length, guid, is_read, is_starred) " +
"VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?);"
);
stmt.bind_text(1, item.id, -1, null);
stmt.bind_text(2, item.subscription_title ?? "", -1, null); // subscription_id: the model carries it in subscription_title (see row_to_item)
stmt.bind_text(3, item.title, -1, null);
stmt.bind_text(4, item.link ?? "", -1, null);
stmt.bind_text(5, item.description ?? "", -1, null);
stmt.bind_text(6, item.content ?? "", -1, null);
stmt.bind_text(7, item.author ?? "", -1, null);
stmt.bind_text(8, item.published ?? "", -1, null);
stmt.bind_text(9, item.updated ?? "", -1, null);
stmt.bind_text(10, format_categories(item.categories), -1, null);
stmt.bind_text(11, item.enclosure_url ?? "", -1, null);
stmt.bind_text(12, item.enclosure_type ?? "", -1, null);
stmt.bind_text(13, item.enclosure_length ?? "", -1, null);
stmt.bind_text(14, item.guid ?? "", -1, null);
stmt.bind_int(15, 0); // is_read
stmt.bind_int(16, 0); // is_starred
if (stmt.step() != Sqlite.DONE) {
throw new DBError.FAILED("Insert failed: %s".printf(db.get_handle().errmsg()));
}
debug("Feed item added: %s", item.id);
item_added(item);
return item;
}
/**
* Add multiple items in a batch
*/
public void add_batch(FeedItem[] items) throws Error {
db.begin_transaction();
try {
foreach (var item in items) {
add(item);
}
db.commit();
debug("Batch insert completed: %d items", items.length);
} catch (Error e) {
db.rollback();
throw new DBError.FAILED("Transaction failed: %s".printf(e.message));
}
}
/**
* Get an item by ID
*/
public FeedItem? get_by_id(string id) throws Error {
var stmt = db.prepare(
"SELECT id, subscription_id, title, link, description, content, author, " +
"published, updated, categories, enclosure_url, enclosure_type, " +
"enclosure_length, guid, is_read, is_starred " +
"FROM feed_items WHERE id = ?;"
);
stmt.bind_text(1, id, -1, null);
if (stmt.step() == Sqlite.ROW) {
return row_to_item(stmt);
}
return null;
}
/**
* Get items by subscription ID
*/
public FeedItem[] get_by_subscription(string subscription_id) throws Error {
var items = new GLib.List<FeedItem?>();
var stmt = db.prepare(
"SELECT id, subscription_id, title, link, description, content, author, " +
"published, updated, categories, enclosure_url, enclosure_type, " +
"enclosure_length, guid, is_read, is_starred " +
"FROM feed_items WHERE subscription_id = ? " +
"ORDER BY published DESC LIMIT 100;"
);
stmt.bind_text(1, subscription_id, -1, null);
while (stmt.step() == Sqlite.ROW) {
var item = row_to_item(stmt);
if (item != null) {
items.append(item);
}
}
return items_to_array(items);
}
/**
* Get all items
*/
public FeedItem[] get_all() throws Error {
var items = new GLib.List<FeedItem?>();
var stmt = db.prepare(
"SELECT id, subscription_id, title, link, description, content, author, " +
"published, updated, categories, enclosure_url, enclosure_type, " +
"enclosure_length, guid, is_read, is_starred " +
"FROM feed_items ORDER BY published DESC LIMIT 1000;"
);
while (stmt.step() == Sqlite.ROW) {
var item = row_to_item(stmt);
if (item != null) {
items.append(item);
}
}
return items_to_array(items);
}
/**
* Search items using FTS
*/
public FeedItem[] search(string query, int limit = 50) throws Error {
var items = new GLib.List<FeedItem?>();
var stmt = db.prepare(
"SELECT f.id, f.subscription_id, f.title, f.link, f.description, f.content, " +
"f.author, f.published, f.updated, f.categories, f.enclosure_url, " +
"f.enclosure_type, f.enclosure_length, f.guid, f.is_read, f.is_starred " +
"FROM feed_items_fts t " +
"JOIN feed_items f ON t.rowid = f.rowid " +
"WHERE feed_items_fts MATCH ? " +
"ORDER BY rank " +
"LIMIT ?;"
);
stmt.bind_text(1, query, -1, null);
stmt.bind_int(2, limit);
while (stmt.step() == Sqlite.ROW) {
var item = row_to_item(stmt);
if (item != null) {
items.append(item);
}
}
return items_to_array(items);
}
/**
* Mark an item as read
*/
public void mark_as_read(string id) throws Error {
var stmt = db.prepare("UPDATE feed_items SET is_read = 1 WHERE id = ?;");
stmt.bind_text(1, id, -1, null);
stmt.step();
debug("Item marked as read: %s", id);
}
/**
* Mark an item as unread
*/
public void mark_as_unread(string id) throws Error {
var stmt = db.prepare("UPDATE feed_items SET is_read = 0 WHERE id = ?;");
stmt.bind_text(1, id, -1, null);
stmt.step();
debug("Item marked as unread: %s", id);
}
/**
* Mark an item as starred
*/
public void mark_as_starred(string id) throws Error {
var stmt = db.prepare("UPDATE feed_items SET is_starred = 1 WHERE id = ?;");
stmt.bind_text(1, id, -1, null);
stmt.step();
debug("Item starred: %s", id);
}
/**
* Unmark an item from starred
*/
public void unmark_starred(string id) throws Error {
var stmt = db.prepare("UPDATE feed_items SET is_starred = 0 WHERE id = ?;");
stmt.bind_text(1, id, -1, null);
stmt.step();
debug("Item unstarred: %s", id);
}
/**
* Get unread items
*/
public FeedItem[] get_unread() throws Error {
var items = new GLib.List<FeedItem?>();
var stmt = db.prepare(
"SELECT id, subscription_id, title, link, description, content, author, " +
"published, updated, categories, enclosure_url, enclosure_type, " +
"enclosure_length, guid, is_read, is_starred " +
"FROM feed_items WHERE is_read = 0 " +
"ORDER BY published DESC LIMIT 100;"
);
while (stmt.step() == Sqlite.ROW) {
var item = row_to_item(stmt);
if (item != null) {
items.append(item);
}
}
return items_to_array(items);
}
/**
* Get starred items
*/
public FeedItem[] get_starred() throws Error {
var items = new GLib.List<FeedItem?>();
var stmt = db.prepare(
"SELECT id, subscription_id, title, link, description, content, author, " +
"published, updated, categories, enclosure_url, enclosure_type, " +
"enclosure_length, guid, is_read, is_starred " +
"FROM feed_items WHERE is_starred = 1 " +
"ORDER BY published DESC LIMIT 100;"
);
while (stmt.step() == Sqlite.ROW) {
var item = row_to_item(stmt);
if (item != null) {
items.append(item);
}
}
return items_to_array(items);
}
/**
* Delete an item by ID
*/
public void delete(string id) throws Error {
var stmt = db.prepare("DELETE FROM feed_items WHERE id = ?;");
stmt.bind_text(1, id, -1, null);
stmt.step();
debug("Item deleted: %s", id);
item_deleted(id);
}
/**
* Delete items by subscription ID
*/
public void delete_by_subscription(string subscription_id) throws Error {
var stmt = db.prepare("DELETE FROM feed_items WHERE subscription_id = ?;");
stmt.bind_text(1, subscription_id, -1, null);
stmt.step();
debug("Items deleted for subscription: %s", subscription_id);
}
/**
* Delete old items, keeping only the newest keep_count items overall
*/
public void cleanup_old_items(int keep_count = 100) throws Error {
db.begin_transaction();
try {
var stmt = db.prepare(
"DELETE FROM feed_items WHERE id NOT IN (" +
"SELECT id FROM feed_items " +
"ORDER BY published DESC " +
"LIMIT -1 OFFSET ?" +
");"
);
stmt.bind_int(1, keep_count);
stmt.step();
db.commit();
debug("Old items cleaned up, kept %d", keep_count);
} catch (Error e) {
db.rollback();
throw new DBError.FAILED("Transaction failed: %s".printf(e.message));
}
}
/**
* Convert a database row to a FeedItem
*/
private FeedItem? row_to_item(Sqlite.Statement stmt) {
try {
string categories_str = stmt.column_text(9);
string[] categories = parse_categories(categories_str);
var item = new FeedItem.with_values(
stmt.column_text(0), // id
stmt.column_text(2), // title
stmt.column_text(3), // link
stmt.column_text(4), // description
stmt.column_text(5), // content
stmt.column_text(6), // author
stmt.column_text(7), // published
stmt.column_text(8), // updated
categories,
stmt.column_text(10), // enclosure_url
stmt.column_text(11), // enclosure_type
stmt.column_text(12), // enclosure_length
stmt.column_text(13), // guid
stmt.column_text(1) // subscription_id (stored as subscription_title)
);
return item;
} catch (Error e) {
warning("Failed to parse item row: %s", e.message);
return null;
}
}
/**
* Format categories array as JSON string
*/
private string format_categories(string[] categories) {
if (categories.length == 0) {
return "[]";
}
var sb = new StringBuilder();
sb.append("[");
for (var i = 0; i < categories.length; i++) {
if (i > 0) sb.append(",");
sb.append("\"");
// Escape backslashes and quotes so the stored string stays valid JSON
sb.append(categories[i].replace("\\", "\\\\").replace("\"", "\\\""));
sb.append("\"");
}
sb.append("]");
return sb.str;
}
/**
* Parse categories from JSON string
*/
private string[] parse_categories(string? json) {
if (json == null || json.length == 0 || json == "[]") {
return {};
}
try {
var parser = new Json.Parser();
if (parser.load_from_data(json)) {
var node = parser.get_root();
if (node.get_node_type() == Json.NodeType.ARRAY) {
var array = node.get_array();
var categories = new string[array.get_length()];
for (var i = 0; i < array.get_length(); i++) {
categories[i] = array.get_string_element(i);
}
return categories;
}
}
} catch (Error e) {
warning("Failed to parse categories: %s", e.message);
}
return {};
}
private FeedItem[] items_to_array(GLib.List<FeedItem?> list) {
FeedItem[] arr = {};
for (unowned var node = list; node != null; node = node.next) {
if (node.data != null) arr += node.data;
}
return arr;
}
}


@@ -1,103 +0,0 @@
-- RSSuper Database Schema
-- SQLite with FTS5 for full-text search
-- Enable foreign keys
PRAGMA foreign_keys = ON;
-- Migration tracking table
CREATE TABLE IF NOT EXISTS schema_migrations (
version INTEGER PRIMARY KEY,
applied_at TEXT NOT NULL DEFAULT (datetime('now'))
);
-- Feed subscriptions table
CREATE TABLE IF NOT EXISTS feed_subscriptions (
id TEXT PRIMARY KEY,
url TEXT NOT NULL UNIQUE,
title TEXT NOT NULL,
category TEXT,
enabled INTEGER NOT NULL DEFAULT 1,
fetch_interval INTEGER NOT NULL DEFAULT 60,
created_at TEXT NOT NULL,
updated_at TEXT NOT NULL,
last_fetched_at TEXT,
next_fetch_at TEXT,
error TEXT,
http_auth_username TEXT,
http_auth_password TEXT
);
-- Feed items table
CREATE TABLE IF NOT EXISTS feed_items (
id TEXT PRIMARY KEY,
subscription_id TEXT NOT NULL,
title TEXT NOT NULL,
link TEXT,
description TEXT,
content TEXT,
author TEXT,
published TEXT,
updated TEXT,
categories TEXT, -- JSON array as text
enclosure_url TEXT,
enclosure_type TEXT,
enclosure_length TEXT,
guid TEXT,
is_read INTEGER NOT NULL DEFAULT 0,
is_starred INTEGER NOT NULL DEFAULT 0,
created_at TEXT NOT NULL DEFAULT (datetime('now')),
FOREIGN KEY (subscription_id) REFERENCES feed_subscriptions(id) ON DELETE CASCADE
);
-- Create index for feed items
CREATE INDEX IF NOT EXISTS idx_feed_items_subscription ON feed_items(subscription_id);
CREATE INDEX IF NOT EXISTS idx_feed_items_published ON feed_items(published DESC);
CREATE INDEX IF NOT EXISTS idx_feed_items_read ON feed_items(is_read);
CREATE INDEX IF NOT EXISTS idx_feed_items_starred ON feed_items(is_starred);
-- Search history table
CREATE TABLE IF NOT EXISTS search_history (
id INTEGER PRIMARY KEY AUTOINCREMENT,
query TEXT NOT NULL,
filters_json TEXT,
sort_option TEXT NOT NULL DEFAULT 'relevance',
page INTEGER NOT NULL DEFAULT 1,
page_size INTEGER NOT NULL DEFAULT 20,
result_count INTEGER,
created_at TEXT NOT NULL DEFAULT (datetime('now'))
);
CREATE INDEX IF NOT EXISTS idx_search_history_created ON search_history(created_at DESC);
-- FTS5 virtual table for full-text search on feed items
CREATE VIRTUAL TABLE IF NOT EXISTS feed_items_fts USING fts5(
title,
description,
content,
author,
content='feed_items',
content_rowid='rowid'
);
-- Trigger to keep FTS table in sync on INSERT
CREATE TRIGGER IF NOT EXISTS feed_items_ai AFTER INSERT ON feed_items BEGIN
INSERT INTO feed_items_fts(rowid, title, description, content, author)
VALUES (new.rowid, new.title, new.description, new.content, new.author);
END;
-- Trigger to keep FTS table in sync on DELETE
CREATE TRIGGER IF NOT EXISTS feed_items_ad AFTER DELETE ON feed_items BEGIN
INSERT INTO feed_items_fts(feed_items_fts, rowid, title, description, content, author)
VALUES('delete', old.rowid, old.title, old.description, old.content, old.author);
END;
-- Trigger to keep FTS table in sync on UPDATE
CREATE TRIGGER IF NOT EXISTS feed_items_au AFTER UPDATE ON feed_items BEGIN
INSERT INTO feed_items_fts(feed_items_fts, rowid, title, description, content, author)
VALUES('delete', old.rowid, old.title, old.description, old.content, old.author);
INSERT INTO feed_items_fts(rowid, title, description, content, author)
VALUES (new.rowid, new.title, new.description, new.content, new.author);
END;
-- Initial migration record
INSERT OR IGNORE INTO schema_migrations (version) VALUES (1);
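The schema uses FTS5 in external-content mode, so the virtual table stores only the index and reads row data back from `feed_items`; the `feed_items_ai`/`feed_items_ad` triggers are what keep that index consistent. A minimal Python sketch of the same trigger pattern (stdlib `sqlite3`, assuming an SQLite build with FTS5 enabled):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE feed_items (id TEXT PRIMARY KEY, title TEXT NOT NULL, content TEXT);
CREATE VIRTUAL TABLE feed_items_fts USING fts5(
    title, content, content='feed_items', content_rowid='rowid');
CREATE TRIGGER feed_items_ai AFTER INSERT ON feed_items BEGIN
    INSERT INTO feed_items_fts(rowid, title, content)
    VALUES (new.rowid, new.title, new.content);
END;
CREATE TRIGGER feed_items_ad AFTER DELETE ON feed_items BEGIN
    -- the special 'delete' command removes the old row from the FTS index
    INSERT INTO feed_items_fts(feed_items_fts, rowid, title, content)
    VALUES ('delete', old.rowid, old.title, old.content);
END;
""")

con.execute("INSERT INTO feed_items VALUES ('a', 'Vala tips', 'GObject patterns')")
hits = con.execute(
    "SELECT title FROM feed_items_fts WHERE feed_items_fts MATCH 'gobject'"
).fetchall()
con.execute("DELETE FROM feed_items WHERE id = 'a'")
remaining = con.execute(
    "SELECT count(*) FROM feed_items_fts WHERE feed_items_fts MATCH 'gobject'"
).fetchone()[0]
print(hits, remaining)  # [('Vala tips',)] 0
```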


@@ -1,171 +0,0 @@
/*
* SearchHistoryStore.vala
*
* CRUD operations for search history.
*/
/**
* SearchHistoryStore - Manages search history persistence
*/
public class RSSuper.SearchHistoryStore : Object {
private Database db;
/**
* Maximum number of history entries to keep
*/
public int max_entries { get; set; default = 100; }
/**
* Signal emitted when a search is recorded
*/
public signal void search_recorded(SearchQuery query, int result_count);
/**
* Signal emitted when history is cleared
*/
public signal void history_cleared();
/**
* Create a new search history store
*/
public SearchHistoryStore(Database db) {
this.db = db;
}
/**
* Record a search query
*/
public int record_search(SearchQuery query, int result_count = 0) throws Error {
var stmt = db.prepare(
"INSERT INTO search_history (query, filters_json, sort_option, page, page_size, result_count) " +
"VALUES (?, ?, ?, ?, ?, ?);"
);
stmt.bind_text(1, query.query, -1, null);
stmt.bind_text(2, query.filters_json ?? "", -1, null);
stmt.bind_text(3, SearchFilters.sort_option_to_string(query.sort), -1, null);
stmt.bind_int(4, query.page);
stmt.bind_int(5, query.page_size);
stmt.bind_int(6, result_count);
stmt.step();
debug("Search recorded: %s (%d results)", query.query, result_count);
search_recorded(query, result_count);
// Clean up old entries if needed
cleanup_old_entries();
return 0; // Placeholder: these bindings do not expose sqlite3_last_insert_rowid()
}
/**
* Get search history
*/
public SearchQuery[] get_history(int limit = 50) throws Error {
var queries = new GLib.List<SearchQuery?>();
var stmt = db.prepare(
"SELECT query, filters_json, sort_option, page, page_size, result_count, created_at " +
"FROM search_history " +
"ORDER BY created_at DESC " +
"LIMIT ?;"
);
stmt.bind_int(1, limit);
while (stmt.step() == Sqlite.ROW) {
var query = row_to_query(stmt);
queries.append(query);
}
return queries_to_array(queries);
}
/**
* Get recent searches (last 24 hours)
*/
public SearchQuery[] get_recent() throws Error {
var queries = new GLib.List<SearchQuery?>();
var now = new DateTime.now_local();
var yesterday = now.add_days(-1);
var threshold = yesterday.format("%Y-%m-%dT%H:%M:%S");
var stmt = db.prepare(
"SELECT query, filters_json, sort_option, page, page_size, result_count, created_at " +
"FROM search_history " +
"WHERE created_at >= ? " +
"ORDER BY created_at DESC " +
"LIMIT 20;"
);
stmt.bind_text(1, threshold, -1, null);
while (stmt.step() == Sqlite.ROW) {
var query = row_to_query(stmt);
queries.append(query);
}
return queries_to_array(queries);
}
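`get_recent()` works because ISO-8601 timestamps stored as TEXT compare correctly as plain strings, so `created_at >= ?` needs no date parsing in SQL. A small illustrative sketch of the threshold computation:

```python
from datetime import datetime, timedelta

# ISO-8601 strings sort lexicographically in chronological order,
# which is why the store can compare TEXT timestamps directly.
now = datetime(2024, 6, 2, 12, 0, 0)
threshold = (now - timedelta(days=1)).strftime("%Y-%m-%dT%H:%M:%S")
print(threshold)  # 2024-06-01T12:00:00

assert "2024-06-02T09:30:00" >= threshold       # within the last 24 h: kept
assert not "2024-05-30T23:59:59" >= threshold   # older entry: filtered out
```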
/**
* Delete a search history entry by ID
*/
public void @delete(int id) throws Error {
var stmt = db.prepare("DELETE FROM search_history WHERE id = ?;");
stmt.bind_int(1, id);
stmt.step();
debug("Search history entry deleted: %d", id);
}
/**
* Clear all search history
*/
public void clear() throws Error {
var stmt = db.prepare("DELETE FROM search_history;");
stmt.step();
debug("Search history cleared");
history_cleared();
}
/**
* Clear old search history entries
*/
private void cleanup_old_entries() throws Error {
var stmt = db.prepare(
"DELETE FROM search_history WHERE id NOT IN (" +
"SELECT id FROM search_history ORDER BY created_at DESC LIMIT ?" +
");"
);
stmt.bind_int(1, max_entries);
stmt.step();
}
/**
* Convert a database row to a SearchQuery
*/
private SearchQuery row_to_query(Sqlite.Statement stmt) {
string query_str = stmt.column_text(0);
string? filters_json = stmt.column_text(1);
string sort_str = stmt.column_text(2);
int page = stmt.column_int(3);
int page_size = stmt.column_int(4);
return SearchQuery(query_str, page, page_size, filters_json,
SearchFilters.sort_option_from_string(sort_str));
}
private SearchQuery[] queries_to_array(GLib.List<SearchQuery?> list) {
SearchQuery[] arr = {};
for (unowned var node = list; node != null; node = node.next) {
arr += node.data;
}
return arr;
}
}


@@ -1,69 +0,0 @@
/*
* SQLite3 C API bindings for Vala
*/
[CCode (cheader_filename = "sqlite3.h")]
namespace SQLite {
[CCode (cname = "sqlite3", free_function = "sqlite3_close")]
public class DB {
[CCode (cname = "sqlite3_open")]
public static int open(string filename, out DB db);
[CCode (cname = "sqlite3_close")]
public int close();
[CCode (cname = "sqlite3_exec")]
public int exec(string sql, DBCallback? callback = null, void* arg = null, [CCode (array_length = false)] out string? errmsg = null);
[CCode (cname = "sqlite3_errmsg")]
public unowned string errmsg();
[CCode (cname = "sqlite3_prepare_v2")]
public int prepare_v2(string zSql, int nByte, out Stmt stmt, void* pzTail = null);
}
[CCode (cname = "sqlite3_stmt", free_function = "sqlite3_finalize")]
public class Stmt {
[CCode (cname = "sqlite3_step")]
public int step();
[CCode (cname = "sqlite3_column_count")]
public int column_count();
[CCode (cname = "sqlite3_column_text")]
public unowned string column_text(int i);
[CCode (cname = "sqlite3_column_int")]
public int column_int(int i);
[CCode (cname = "sqlite3_column_double")]
public double column_double(int i);
[CCode (cname = "sqlite3_bind_text")]
public int bind_text(int i, string z, int n, void* x);
[CCode (cname = "sqlite3_bind_int")]
public int bind_int(int i, int val);
[CCode (cname = "sqlite3_bind_double")]
public int bind_double(int i, double val);
[CCode (cname = "sqlite3_bind_null")]
public int bind_null(int i);
[CCode (cname = "sqlite3_finalize")]
public int finalize();
}
[CCode (cname = "SQLITE_OK")]
public const int SQLITE_OK;
[CCode (cname = "SQLITE_ROW")]
public const int SQLITE_ROW;
[CCode (cname = "SQLITE_DONE")]
public const int SQLITE_DONE;
[CCode (cname = "SQLITE_ERROR")]
public const int SQLITE_ERROR;
[CCode (has_target = false)]
public delegate int DBCallback(void* arg, int argc, string[] argv, string[] col_names);
}


@@ -1,244 +0,0 @@
/*
* SubscriptionStore.vala
*
* CRUD operations for feed subscriptions.
*/
/**
* SubscriptionStore - Manages feed subscription persistence
*/
public class RSSuper.SubscriptionStore : Object {
private Database db;
/**
* Signal emitted when a subscription is added
*/
public signal void subscription_added(FeedSubscription subscription);
/**
* Signal emitted when a subscription is updated
*/
public signal void subscription_updated(FeedSubscription subscription);
/**
* Signal emitted when a subscription is deleted
*/
public signal void subscription_deleted(string id);
/**
* Create a new subscription store
*/
public SubscriptionStore(Database db) {
this.db = db;
}
/**
* Add a new subscription
*/
public FeedSubscription add(FeedSubscription subscription) throws Error {
var stmt = db.prepare(
"INSERT INTO feed_subscriptions (id, url, title, category, enabled, fetch_interval, " +
"created_at, updated_at, last_fetched_at, next_fetch_at, error, http_auth_username, http_auth_password) " +
"VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?);"
);
stmt.bind_text(1, subscription.id, -1, null);
stmt.bind_text(2, subscription.url, -1, null);
stmt.bind_text(3, subscription.title, -1, null);
stmt.bind_text(4, subscription.category ?? "", -1, null);
stmt.bind_int(5, subscription.enabled ? 1 : 0);
stmt.bind_int(6, subscription.fetch_interval);
stmt.bind_text(7, subscription.created_at, -1, null);
stmt.bind_text(8, subscription.updated_at, -1, null);
stmt.bind_text(9, subscription.last_fetched_at ?? "", -1, null);
stmt.bind_text(10, subscription.next_fetch_at ?? "", -1, null);
stmt.bind_text(11, subscription.error ?? "", -1, null);
stmt.bind_text(12, subscription.http_auth_username ?? "", -1, null);
stmt.bind_text(13, subscription.http_auth_password ?? "", -1, null);
stmt.step();
debug("Subscription added: %s", subscription.id);
subscription_added(subscription);
return subscription;
}
/**
* Get a subscription by ID
*/
public FeedSubscription? get_by_id(string id) throws Error {
var stmt = db.prepare(
"SELECT id, url, title, category, enabled, fetch_interval, created_at, updated_at, " +
"last_fetched_at, next_fetch_at, error, http_auth_username, http_auth_password " +
"FROM feed_subscriptions WHERE id = ?;"
);
stmt.bind_text(1, id, -1, null);
if (stmt.step() == Sqlite.ROW) {
return row_to_subscription(stmt);
}
return null;
}
/**
* Get all subscriptions
*/
public FeedSubscription[] get_all() throws Error {
var subscriptions = new GLib.List<FeedSubscription?>();
var stmt = db.prepare(
"SELECT id, url, title, category, enabled, fetch_interval, created_at, updated_at, " +
"last_fetched_at, next_fetch_at, error, http_auth_username, http_auth_password " +
"FROM feed_subscriptions ORDER BY title;"
);
while (stmt.step() == Sqlite.ROW) {
var subscription = row_to_subscription(stmt);
if (subscription != null) {
subscriptions.append(subscription);
}
}
return list_to_array(subscriptions);
}
/**
* Update a subscription
*/
public void update(FeedSubscription subscription) throws Error {
var stmt = db.prepare(
"UPDATE feed_subscriptions SET url = ?, title = ?, category = ?, enabled = ?, " +
"fetch_interval = ?, updated_at = ?, last_fetched_at = ?, next_fetch_at = ?, " +
"error = ?, http_auth_username = ?, http_auth_password = ? " +
"WHERE id = ?;"
);
stmt.bind_text(1, subscription.url, -1, null);
stmt.bind_text(2, subscription.title, -1, null);
stmt.bind_text(3, subscription.category ?? "", -1, null);
stmt.bind_int(4, subscription.enabled ? 1 : 0);
stmt.bind_int(5, subscription.fetch_interval);
stmt.bind_text(6, subscription.updated_at, -1, null);
stmt.bind_text(7, subscription.last_fetched_at ?? "", -1, null);
stmt.bind_text(8, subscription.next_fetch_at ?? "", -1, null);
stmt.bind_text(9, subscription.error ?? "", -1, null);
stmt.bind_text(10, subscription.http_auth_username ?? "", -1, null);
stmt.bind_text(11, subscription.http_auth_password ?? "", -1, null);
stmt.bind_text(12, subscription.id, -1, null);
stmt.step();
debug("Subscription updated: %s", subscription.id);
subscription_updated(subscription);
}
/**
* Delete a subscription
*/
public void remove_subscription(string id) throws Error {
var stmt = db.prepare("DELETE FROM feed_subscriptions WHERE id = ?;");
stmt.bind_text(1, id, -1, null);
stmt.step();
debug("Subscription deleted: %s", id);
subscription_deleted(id);
}
/**
* Delete a subscription by object
*/
public void delete_subscription(FeedSubscription subscription) throws Error {
remove_subscription(subscription.id);
}
/**
* Get enabled subscriptions
*/
public FeedSubscription[] get_enabled() throws Error {
var subscriptions = new GLib.List<FeedSubscription?>();
var stmt = db.prepare(
"SELECT id, url, title, category, enabled, fetch_interval, created_at, updated_at, " +
"last_fetched_at, next_fetch_at, error, http_auth_username, http_auth_password " +
"FROM feed_subscriptions WHERE enabled = 1 ORDER BY title;"
);
while (stmt.step() == Sqlite.ROW) {
var subscription = row_to_subscription(stmt);
if (subscription != null) {
subscriptions.append(subscription);
}
}
return list_to_array(subscriptions);
}
/**
* Get subscriptions that need fetching
*/
public FeedSubscription[] get_due_for_fetch() throws Error {
var subscriptions = new GLib.List<FeedSubscription?>();
var now = new DateTime.now_local();
var now_str = now.format("%Y-%m-%dT%H:%M:%S");
var stmt = db.prepare(
"SELECT id, url, title, category, enabled, fetch_interval, created_at, updated_at, " +
"last_fetched_at, next_fetch_at, error, http_auth_username, http_auth_password " +
"FROM feed_subscriptions WHERE enabled = 1 AND " +
"(next_fetch_at IS NULL OR next_fetch_at <= ?) " +
"ORDER BY next_fetch_at ASC;"
);
stmt.bind_text(1, now_str, -1, null);
while (stmt.step() == Sqlite.ROW) {
var subscription = row_to_subscription(stmt);
if (subscription != null) {
subscriptions.append(subscription);
}
}
return list_to_array(subscriptions);
}
/**
* Convert a database row to a FeedSubscription
*/
private FeedSubscription? row_to_subscription(Sqlite.Statement stmt) {
try {
var subscription = new FeedSubscription.with_values(
stmt.column_text(0), // id
stmt.column_text(1), // url
stmt.column_text(2), // title
stmt.column_int(5), // fetch_interval
stmt.column_text(3), // category
stmt.column_int(4) == 1, // enabled
stmt.column_text(6), // created_at
stmt.column_text(7), // updated_at
stmt.column_text(8), // last_fetched_at
stmt.column_text(9), // next_fetch_at
stmt.column_text(10), // error
stmt.column_text(11), // http_auth_username
stmt.column_text(12) // http_auth_password
);
return subscription;
} catch (Error e) {
warning("Failed to parse subscription row: %s", e.message);
return null;
}
}
private FeedSubscription[] list_to_array(GLib.List<FeedSubscription?> list) {
FeedSubscription[] arr = {};
for (unowned var node = list; node != null; node = node.next) {
if (node.data != null) arr += node.data;
}
return arr;
}
}


@@ -1,313 +0,0 @@
/*
* FeedItem.vala
*
* Represents a single RSS/Atom feed item (article, episode, etc.)
* Following GNOME HIG naming conventions and Vala/GObject patterns.
*/
/**
* Enclosure metadata for media attachments (podcasts, videos, etc.)
*/
public struct RSSuper.Enclosure {
public string url { get; set; }
public string item_type { get; set; }
public string? length { get; set; }
public Enclosure(string url, string type, string? length = null) {
this.url = url;
this.item_type = type;
this.length = length;
}
}
/**
* FeedItem - Represents a single RSS/Atom entry
*/
public class RSSuper.FeedItem : Object {
public string id { get; set; }
public string title { get; set; }
public string? link { get; set; }
public string? description { get; set; }
public string? content { get; set; }
public string? author { get; set; }
public string? published { get; set; }
public string? updated { get; set; }
public string[] categories { get; set; }
public string? enclosure_url { get; set; }
public string? enclosure_type { get; set; }
public string? enclosure_length { get; set; }
public string? guid { get; set; }
public string? subscription_title { get; set; }
/**
* Default constructor
*/
public FeedItem() {
this.id = "";
this.title = "";
this.categories = {};
}
/**
* Constructor with initial values
*/
public FeedItem.with_values(string id, string title, string? link = null,
string? description = null, string? content = null,
string? author = null, string? published = null,
string? updated = null, string[]? categories = null,
string? enclosure_url = null, string? enclosure_type = null,
string? enclosure_length = null, string? guid = null,
string? subscription_title = null) {
this.id = id;
this.title = title;
this.link = link;
this.description = description;
this.content = content;
this.author = author;
this.published = published;
this.updated = updated;
this.categories = (categories != null) ? categories : new string[0];
this.enclosure_url = enclosure_url;
this.enclosure_type = enclosure_type;
this.enclosure_length = enclosure_length;
this.guid = guid;
this.subscription_title = subscription_title;
}
/**
* Get enclosure as struct
*/
public Enclosure? get_enclosure() {
if (this.enclosure_url == null) {
return null;
}
return Enclosure(this.enclosure_url, this.enclosure_type ?? "", this.enclosure_length);
}
/**
* Set enclosure from struct
*/
public void set_enclosure(Enclosure? enclosure) {
if (enclosure == null) {
this.enclosure_url = null;
this.enclosure_type = null;
this.enclosure_length = null;
} else {
this.enclosure_url = enclosure.url;
this.enclosure_type = enclosure.item_type;
this.enclosure_length = enclosure.length;
}
}
/**
* Serialize to JSON string
*/
public string to_json_string() {
var sb = new StringBuilder();
sb.append("{");
sb.append("\"id\":\"");
sb.append(this.id);
sb.append("\",\"title\":\"");
sb.append(this.title);
sb.append("\"");
if (this.link != null) {
sb.append(",\"link\":\"");
sb.append(this.link);
sb.append("\"");
}
if (this.description != null) {
sb.append(",\"description\":\"");
sb.append(this.description);
sb.append("\"");
}
if (this.content != null) {
sb.append(",\"content\":\"");
sb.append(this.content);
sb.append("\"");
}
if (this.author != null) {
sb.append(",\"author\":\"");
sb.append(this.author);
sb.append("\"");
}
if (this.published != null) {
sb.append(",\"published\":\"");
sb.append(this.published);
sb.append("\"");
}
if (this.updated != null) {
sb.append(",\"updated\":\"");
sb.append(this.updated);
sb.append("\"");
}
if (this.categories.length > 0) {
sb.append(",\"categories\":[");
for (var i = 0; i < this.categories.length; i++) {
if (i > 0) sb.append(",");
sb.append("\"");
sb.append(this.categories[i]);
sb.append("\"");
}
sb.append("]");
}
if (this.enclosure_url != null) {
sb.append(",\"enclosure\":{\"url\":\"");
sb.append(this.enclosure_url);
sb.append("\"");
if (this.enclosure_type != null) {
sb.append(",\"type\":\"");
sb.append(this.enclosure_type);
sb.append("\"");
}
if (this.enclosure_length != null) {
sb.append(",\"length\":\"");
sb.append(this.enclosure_length);
sb.append("\"");
}
sb.append("}");
}
if (this.guid != null) {
sb.append(",\"guid\":\"");
sb.append(this.guid);
sb.append("\"");
}
if (this.subscription_title != null) {
sb.append(",\"subscription_title\":\"");
sb.append(this.subscription_title);
sb.append("\"");
}
sb.append("}");
return sb.str;
}
/**
* Deserialize from JSON string (simple parser)
*/
public static FeedItem? from_json_string(string json_string) {
var parser = new Json.Parser();
try {
if (!parser.load_from_data(json_string)) {
return null;
}
} catch (Error e) {
warning("Failed to parse JSON: %s", e.message);
return null;
}
return from_json_node(parser.get_root());
}
/**
* Deserialize from Json.Node
*/
public static FeedItem? from_json_node(Json.Node node) {
if (node.get_node_type() != Json.NodeType.OBJECT) {
return null;
}
var obj = node.get_object();
if (!obj.has_member("id") || !obj.has_member("title")) {
return null;
}
var item = new FeedItem();
item.id = obj.get_string_member("id");
item.title = obj.get_string_member("title");
if (obj.has_member("link")) {
item.link = obj.get_string_member("link");
}
if (obj.has_member("description")) {
item.description = obj.get_string_member("description");
}
if (obj.has_member("content")) {
item.content = obj.get_string_member("content");
}
if (obj.has_member("author")) {
item.author = obj.get_string_member("author");
}
if (obj.has_member("published")) {
item.published = obj.get_string_member("published");
}
if (obj.has_member("updated")) {
item.updated = obj.get_string_member("updated");
}
if (obj.has_member("categories")) {
var categories_array = obj.get_array_member("categories");
var categories = new string[categories_array.get_length()];
for (var i = 0; i < categories_array.get_length(); i++) {
categories[i] = categories_array.get_string_element(i);
}
item.categories = categories;
}
if (obj.has_member("enclosure")) {
var enclosure_obj = obj.get_object_member("enclosure");
item.enclosure_url = enclosure_obj.get_string_member("url");
if (enclosure_obj.has_member("type")) {
item.enclosure_type = enclosure_obj.get_string_member("type");
}
if (enclosure_obj.has_member("length")) {
item.enclosure_length = enclosure_obj.get_string_member("length");
}
}
if (obj.has_member("guid")) {
item.guid = obj.get_string_member("guid");
}
if (obj.has_member("subscription_title")) {
item.subscription_title = obj.get_string_member("subscription_title");
}
return item;
}
/**
* Equality comparison
*/
public bool equals(FeedItem? other) {
if (other == null) {
return false;
}
return this.id == other.id &&
this.title == other.title &&
this.link == other.link &&
this.description == other.description &&
this.content == other.content &&
this.author == other.author &&
this.published == other.published &&
this.updated == other.updated &&
this.categories_equal(other.categories) &&
this.enclosure_url == other.enclosure_url &&
this.enclosure_type == other.enclosure_type &&
this.enclosure_length == other.enclosure_length &&
this.guid == other.guid &&
this.subscription_title == other.subscription_title;
}
/**
* Helper for category array comparison
*/
private bool categories_equal(string[] other) {
if (this.categories.length != other.length) {
return false;
}
for (var i = 0; i < this.categories.length; i++) {
if (this.categories[i] != other[i]) {
return false;
}
}
return true;
}
/**
* Get a human-readable summary
*/
public string get_summary() {
return "%s by %s".printf(this.title, this.author ?? "Unknown");
}
}
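One caveat with the hand-rolled `to_json_string()` above: it concatenates raw field values, so a title containing a double quote would yield invalid JSON. An illustrative Python sketch of why a proper encoder matters:

```python
import json

item = {"id": "1", "title": 'He said "hi"'}

encoded = json.dumps(item)            # a real encoder escapes the inner quotes
assert json.loads(encoded)["title"] == 'He said "hi"'

# Naive string concatenation, as in to_json_string(), breaks on the same title:
naive = '{"id":"1","title":"' + item["title"] + '"}'
try:
    json.loads(naive)
    ok = True
except json.JSONDecodeError:
    ok = False
print(ok)  # False
```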


@@ -1,259 +0,0 @@
/*
* FeedSubscription.vala
*
* Represents a user's subscription to a feed with sync settings.
* Following GNOME HIG naming conventions and Vala/GObject patterns.
*/
/**
* HTTP Authentication credentials
*/
public struct RSSuper.HttpAuth {
public string username { get; set; }
public string password { get; set; }
public HttpAuth(string username, string password) {
this.username = username;
this.password = password;
}
}
/**
* FeedSubscription - Represents a user's subscription to a feed
*/
public class RSSuper.FeedSubscription : Object {
public string id { get; set; }
public string url { get; set; }
public string title { get; set; }
public string? category { get; set; }
public bool enabled { get; set; }
public int fetch_interval { get; set; }
public string created_at { get; set; }
public string updated_at { get; set; }
public string? last_fetched_at { get; set; }
public string? next_fetch_at { get; set; }
public string? error { get; set; }
public string? http_auth_username { get; set; }
public string? http_auth_password { get; set; }
/**
* Default constructor
*/
public FeedSubscription() {
this.id = "";
this.url = "";
this.title = "";
this.enabled = true;
this.fetch_interval = 60;
this.created_at = "";
this.updated_at = "";
}
/**
* Constructor with initial values
*/
public FeedSubscription.with_values(string id, string url, string title,
int fetch_interval = 60,
string? category = null, bool enabled = true,
string? created_at = null, string? updated_at = null,
string? last_fetched_at = null,
string? next_fetch_at = null,
string? error = null,
string? http_auth_username = null,
string? http_auth_password = null) {
this.id = id;
this.url = url;
this.title = title;
this.category = category;
this.enabled = enabled;
this.fetch_interval = fetch_interval;
this.created_at = created_at ?? "";
this.updated_at = updated_at ?? "";
this.last_fetched_at = last_fetched_at;
this.next_fetch_at = next_fetch_at;
this.error = error;
this.http_auth_username = http_auth_username;
this.http_auth_password = http_auth_password;
}
/**
* Get HTTP auth as struct
*/
public HttpAuth? get_http_auth() {
if (this.http_auth_username == null) {
return null;
}
return HttpAuth(this.http_auth_username, this.http_auth_password ?? "");
}
/**
* Set HTTP auth from struct
*/
public void set_http_auth(HttpAuth? auth) {
if (auth == null) {
this.http_auth_username = null;
this.http_auth_password = null;
} else {
this.http_auth_username = auth.username;
this.http_auth_password = auth.password;
}
}
/**
* Check if subscription has an error
*/
public bool has_error() {
return this.error != null && this.error.length > 0;
}
/**
* Serialize to JSON string
*/
public string to_json_string() {
var sb = new StringBuilder();
sb.append("{");
sb.append("\"id\":\"");
sb.append(this.id);
sb.append("\",\"url\":\"");
sb.append(this.url);
sb.append("\",\"title\":\"");
sb.append(this.title);
sb.append("\",\"enabled\":");
sb.append(this.enabled ? "true" : "false");
sb.append(",\"fetchInterval\":%d".printf(this.fetch_interval));
sb.append(",\"createdAt\":\"");
sb.append(this.created_at);
sb.append("\",\"updatedAt\":\"");
sb.append(this.updated_at);
sb.append("\"");
if (this.category != null) {
sb.append(",\"category\":\"");
sb.append(this.category);
sb.append("\"");
}
if (this.last_fetched_at != null) {
sb.append(",\"lastFetchedAt\":\"");
sb.append(this.last_fetched_at);
sb.append("\"");
}
if (this.next_fetch_at != null) {
sb.append(",\"nextFetchAt\":\"");
sb.append(this.next_fetch_at);
sb.append("\"");
}
if (this.error != null) {
sb.append(",\"error\":\"");
sb.append(this.error);
sb.append("\"");
}
if (this.http_auth_username != null) {
sb.append(",\"httpAuth\":{\"username\":\"");
sb.append(this.http_auth_username);
sb.append("\"");
if (this.http_auth_password != null) {
sb.append(",\"password\":\"");
sb.append(this.http_auth_password);
sb.append("\"");
}
sb.append("}");
}
sb.append("}");
return sb.str;
}
/**
* Deserialize from JSON string
*/
public static FeedSubscription? from_json_string(string json_string) {
var parser = new Json.Parser();
try {
if (!parser.load_from_data(json_string)) {
return null;
}
} catch (Error e) {
warning("Failed to parse JSON: %s", e.message);
return null;
}
return from_json_node(parser.get_root());
}
/**
* Deserialize from Json.Node
*/
public static FeedSubscription? from_json_node(Json.Node node) {
if (node.get_node_type() != Json.NodeType.OBJECT) {
return null;
}
var obj = node.get_object();
if (!obj.has_member("id") || !obj.has_member("url") ||
!obj.has_member("title") || !obj.has_member("createdAt") ||
!obj.has_member("updatedAt")) {
return null;
}
var subscription = new FeedSubscription();
subscription.id = obj.get_string_member("id");
subscription.url = obj.get_string_member("url");
subscription.title = obj.get_string_member("title");
if (obj.has_member("category")) {
subscription.category = obj.get_string_member("category");
}
if (obj.has_member("enabled")) {
subscription.enabled = obj.get_boolean_member("enabled");
}
if (obj.has_member("fetchInterval")) {
subscription.fetch_interval = (int)obj.get_int_member("fetchInterval");
}
subscription.created_at = obj.get_string_member("createdAt");
subscription.updated_at = obj.get_string_member("updatedAt");
if (obj.has_member("lastFetchedAt")) {
subscription.last_fetched_at = obj.get_string_member("lastFetchedAt");
}
if (obj.has_member("nextFetchAt")) {
subscription.next_fetch_at = obj.get_string_member("nextFetchAt");
}
if (obj.has_member("error")) {
subscription.error = obj.get_string_member("error");
}
if (obj.has_member("httpAuth")) {
var auth_obj = obj.get_object_member("httpAuth");
subscription.http_auth_username = auth_obj.get_string_member("username");
if (auth_obj.has_member("password")) {
subscription.http_auth_password = auth_obj.get_string_member("password");
}
}
return subscription;
}
/**
* Equality comparison
*/
public bool equals(FeedSubscription? other) {
if (other == null) {
return false;
}
return this.id == other.id &&
this.url == other.url &&
this.title == other.title &&
this.category == other.category &&
this.enabled == other.enabled &&
this.fetch_interval == other.fetch_interval &&
this.created_at == other.created_at &&
this.updated_at == other.updated_at &&
this.last_fetched_at == other.last_fetched_at &&
this.next_fetch_at == other.next_fetch_at &&
this.error == other.error &&
this.http_auth_username == other.http_auth_username &&
this.http_auth_password == other.http_auth_password;
}
}


@@ -1,282 +0,0 @@
/*
* Feed.vala
*
* Represents an RSS/Atom feed with its metadata and items.
* Following GNOME HIG naming conventions and Vala/GObject patterns.
*/
/**
* Feed - Represents an RSS/Atom feed
*/
public class RSSuper.Feed : Object {
public string id { get; set; }
public string title { get; set; }
public string? link { get; set; }
public string? description { get; set; }
public string? subtitle { get; set; }
public string? language { get; set; }
public string? last_build_date { get; set; }
public string? updated { get; set; }
public string? generator { get; set; }
public int ttl { get; set; }
public string raw_url { get; set; }
public string? last_fetched_at { get; set; }
public string? next_fetch_at { get; set; }
public FeedItem[] items { get; set; }
/**
* Default constructor
*/
public Feed() {
this.id = "";
this.title = "";
this.raw_url = "";
this.ttl = 60;
this.items = {};
}
/**
* Constructor with initial values
*/
public Feed.with_values(string id, string title, string raw_url,
string? link = null, string? description = null,
string? subtitle = null, string? language = null,
string? last_build_date = null, string? updated = null,
string? generator = null, int ttl = 60,
FeedItem[]? items = null, string? last_fetched_at = null,
string? next_fetch_at = null) {
this.id = id;
this.title = title;
this.link = link;
this.description = description;
this.subtitle = subtitle;
this.language = language;
this.last_build_date = last_build_date;
this.updated = updated;
this.generator = generator;
this.ttl = ttl;
this.items = (items != null) ? items : new FeedItem[0];
this.raw_url = raw_url;
this.last_fetched_at = last_fetched_at;
this.next_fetch_at = next_fetch_at;
}
/**
* Add an item to the feed
*/
public void add_item(FeedItem item) {
var new_items = new FeedItem[this.items.length + 1];
for (var i = 0; i < this.items.length; i++) {
new_items[i] = this.items[i];
}
new_items[this.items.length] = item;
this.items = new_items;
}
/**
* Get item count
*/
public int get_item_count() {
return this.items.length;
}
/**
* Serialize to JSON string
*/
public string to_json_string() {
var sb = new StringBuilder();
sb.append("{");
sb.append("\"id\":\"");
sb.append(this.id);
sb.append("\",\"title\":\"");
sb.append(this.title);
sb.append("\",\"raw_url\":\"");
sb.append(this.raw_url);
sb.append("\"");
if (this.link != null) {
sb.append(",\"link\":\"");
sb.append(this.link);
sb.append("\"");
}
if (this.description != null) {
sb.append(",\"description\":\"");
sb.append(this.description);
sb.append("\"");
}
if (this.subtitle != null) {
sb.append(",\"subtitle\":\"");
sb.append(this.subtitle);
sb.append("\"");
}
if (this.language != null) {
sb.append(",\"language\":\"");
sb.append(this.language);
sb.append("\"");
}
if (this.last_build_date != null) {
sb.append(",\"lastBuildDate\":\"");
sb.append(this.last_build_date);
sb.append("\"");
}
if (this.updated != null) {
sb.append(",\"updated\":\"");
sb.append(this.updated);
sb.append("\"");
}
if (this.generator != null) {
sb.append(",\"generator\":\"");
sb.append(this.generator);
sb.append("\"");
}
if (this.ttl != 60) {
sb.append(",\"ttl\":%d".printf(this.ttl));
}
if (this.items.length > 0) {
sb.append(",\"items\":[");
for (var i = 0; i < this.items.length; i++) {
if (i > 0) sb.append(",");
sb.append(this.items[i].to_json_string());
}
sb.append("]");
}
if (this.last_fetched_at != null) {
sb.append(",\"lastFetchedAt\":\"");
sb.append(this.last_fetched_at);
sb.append("\"");
}
if (this.next_fetch_at != null) {
sb.append(",\"nextFetchAt\":\"");
sb.append(this.next_fetch_at);
sb.append("\"");
}
sb.append("}");
return sb.str;
}
/**
* Deserialize from JSON string
*/
public static Feed? from_json_string(string json_string) {
var parser = new Json.Parser();
try {
if (!parser.load_from_data(json_string)) {
return null;
}
} catch (Error e) {
warning("Failed to parse JSON: %s", e.message);
return null;
}
return from_json_node(parser.get_root());
}
/**
* Deserialize from Json.Node
*/
public static Feed? from_json_node(Json.Node node) {
if (node.get_node_type() != Json.NodeType.OBJECT) {
return null;
}
var obj = node.get_object();
if (!obj.has_member("id") || !obj.has_member("title") || !obj.has_member("raw_url")) {
return null;
}
var feed = new Feed();
feed.id = obj.get_string_member("id");
feed.title = obj.get_string_member("title");
feed.raw_url = obj.get_string_member("raw_url");
if (obj.has_member("link")) {
feed.link = obj.get_string_member("link");
}
if (obj.has_member("description")) {
feed.description = obj.get_string_member("description");
}
if (obj.has_member("subtitle")) {
feed.subtitle = obj.get_string_member("subtitle");
}
if (obj.has_member("language")) {
feed.language = obj.get_string_member("language");
}
if (obj.has_member("lastBuildDate")) {
feed.last_build_date = obj.get_string_member("lastBuildDate");
}
if (obj.has_member("updated")) {
feed.updated = obj.get_string_member("updated");
}
if (obj.has_member("generator")) {
feed.generator = obj.get_string_member("generator");
}
if (obj.has_member("ttl")) {
feed.ttl = (int)obj.get_int_member("ttl");
}
if (obj.has_member("lastFetchedAt")) {
feed.last_fetched_at = obj.get_string_member("lastFetchedAt");
}
if (obj.has_member("nextFetchAt")) {
feed.next_fetch_at = obj.get_string_member("nextFetchAt");
}
// Deserialize items
if (obj.has_member("items")) {
var items_array = obj.get_array_member("items");
var items = new FeedItem[items_array.get_length()];
for (var i = 0; i < items_array.get_length(); i++) {
var item_node = items_array.get_element(i);
var item = FeedItem.from_json_node(item_node);
if (item != null) {
items[i] = item;
}
}
feed.items = items;
}
return feed;
}
/**
* Equality comparison
*/
public bool equals(Feed? other) {
if (other == null) {
return false;
}
return this.id == other.id &&
this.title == other.title &&
this.link == other.link &&
this.description == other.description &&
this.subtitle == other.subtitle &&
this.language == other.language &&
this.last_build_date == other.last_build_date &&
this.updated == other.updated &&
this.generator == other.generator &&
this.ttl == other.ttl &&
this.raw_url == other.raw_url &&
this.last_fetched_at == other.last_fetched_at &&
this.next_fetch_at == other.next_fetch_at &&
this.items_equal(other.items);
}
/**
* Helper for item array comparison
*/
private bool items_equal(FeedItem[] other) {
if (this.items.length != other.length) {
return false;
}
for (var i = 0; i < this.items.length; i++) {
if (!this.items[i].equals(other[i])) {
return false;
}
}
return true;
}
}
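/*
 * Illustrative usage sketch (not part of the original file): build a feed
 * and round-trip it through the JSON helpers above. Note that
 * to_json_string() does not escape quotes or backslashes in field values,
 * so the round-trip only holds for values without JSON-special characters.
 */
private void example_feed_round_trip() {
var feed = new RSSuper.Feed();
feed.id = "feed-1";
feed.title = "Example Feed";
feed.raw_url = "https://example.org/rss.xml";
var restored = RSSuper.Feed.from_json_string(feed.to_json_string());
assert(restored != null);
assert(restored.title == "Example Feed");
}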


@@ -1,5 +0,0 @@
/*
* Namespace definition for RSSuper Linux models
*/
namespace RSSuper {
}


@@ -1,190 +0,0 @@
/*
* NotificationPreferences.vala
*
* Represents user notification preferences.
* Following GNOME HIG naming conventions and Vala/GObject patterns.
*/
/**
* NotificationPreferences - User notification settings
*/
public class RSSuper.NotificationPreferences : Object {
public bool new_articles { get; set; }
public bool episode_releases { get; set; }
public bool custom_alerts { get; set; }
public bool badge_count { get; set; }
public bool sound { get; set; }
public bool vibration { get; set; }
/**
* Default constructor (all enabled by default)
*/
public NotificationPreferences() {
this.new_articles = true;
this.episode_releases = true;
this.custom_alerts = true;
this.badge_count = true;
this.sound = true;
this.vibration = true;
}
/**
* Constructor with initial values
*/
public NotificationPreferences.with_values(bool new_articles = true,
bool episode_releases = true,
bool custom_alerts = true,
bool badge_count = true,
bool sound = true,
bool vibration = true) {
this.new_articles = new_articles;
this.episode_releases = episode_releases;
this.custom_alerts = custom_alerts;
this.badge_count = badge_count;
this.sound = sound;
this.vibration = vibration;
}
/**
* Enable all notifications
*/
public void enable_all() {
this.new_articles = true;
this.episode_releases = true;
this.custom_alerts = true;
this.badge_count = true;
this.sound = true;
this.vibration = true;
}
/**
* Disable all notifications
*/
public void disable_all() {
this.new_articles = false;
this.episode_releases = false;
this.custom_alerts = false;
this.badge_count = false;
this.sound = false;
this.vibration = false;
}
/**
* Check if any notifications are enabled
*/
public bool has_any_enabled() {
return this.new_articles ||
this.episode_releases ||
this.custom_alerts ||
this.badge_count ||
this.sound ||
this.vibration;
}
/**
* Check if content notifications are enabled
*/
public bool has_content_notifications() {
return this.new_articles || this.episode_releases || this.custom_alerts;
}
/**
* Serialize to JSON string
*/
public string to_json_string() {
var sb = new StringBuilder();
sb.append("{");
sb.append("\"newArticles\":");
sb.append(this.new_articles ? "true" : "false");
sb.append(",\"episodeReleases\":");
sb.append(this.episode_releases ? "true" : "false");
sb.append(",\"customAlerts\":");
sb.append(this.custom_alerts ? "true" : "false");
sb.append(",\"badgeCount\":");
sb.append(this.badge_count ? "true" : "false");
sb.append(",\"sound\":");
sb.append(this.sound ? "true" : "false");
sb.append(",\"vibration\":");
sb.append(this.vibration ? "true" : "false");
sb.append("}");
return sb.str;
}
/**
* Deserialize from JSON string
*/
public static NotificationPreferences? from_json_string(string json_string) {
var parser = new Json.Parser();
try {
if (!parser.load_from_data(json_string)) {
return null;
}
} catch (Error e) {
warning("Failed to parse JSON: %s", e.message);
return null;
}
return from_json_node(parser.get_root());
}
/**
* Deserialize from Json.Node
*/
public static NotificationPreferences? from_json_node(Json.Node node) {
if (node.get_node_type() != Json.NodeType.OBJECT) {
return null;
}
var obj = node.get_object();
var prefs = new NotificationPreferences();
if (obj.has_member("newArticles")) {
prefs.new_articles = obj.get_boolean_member("newArticles");
}
if (obj.has_member("episodeReleases")) {
prefs.episode_releases = obj.get_boolean_member("episodeReleases");
}
if (obj.has_member("customAlerts")) {
prefs.custom_alerts = obj.get_boolean_member("customAlerts");
}
if (obj.has_member("badgeCount")) {
prefs.badge_count = obj.get_boolean_member("badgeCount");
}
if (obj.has_member("sound")) {
prefs.sound = obj.get_boolean_member("sound");
}
if (obj.has_member("vibration")) {
prefs.vibration = obj.get_boolean_member("vibration");
}
return prefs;
}
/**
* Equality comparison
*/
public bool equals(NotificationPreferences? other) {
if (other == null) {
return false;
}
return this.new_articles == other.new_articles &&
this.episode_releases == other.episode_releases &&
this.custom_alerts == other.custom_alerts &&
this.badge_count == other.badge_count &&
this.sound == other.sound &&
this.vibration == other.vibration;
}
/**
* Copy preferences from another instance
*/
public void copy_from(NotificationPreferences other) {
this.new_articles = other.new_articles;
this.episode_releases = other.episode_releases;
this.custom_alerts = other.custom_alerts;
this.badge_count = other.badge_count;
this.sound = other.sound;
this.vibration = other.vibration;
}
}
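/*
 * Illustrative usage sketch (not part of the original file): mute all
 * notifications, then re-enable only new-article alerts.
 */
private void example_notification_prefs() {
var prefs = new RSSuper.NotificationPreferences();
prefs.disable_all();
prefs.new_articles = true;
assert(prefs.has_content_notifications());
assert(!prefs.sound);
}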


@@ -1,168 +0,0 @@
/*
* ReadingPreferences.vala
*
* Represents user reading/display preferences.
* Following GNOME HIG naming conventions and Vala/GObject patterns.
*/
/**
* FontSize - Available font size options
*/
public enum RSSuper.FontSize {
SMALL,
MEDIUM,
LARGE,
XLARGE
}
/**
* LineHeight - Available line height options
*/
public enum RSSuper.LineHeight {
NORMAL,
RELAXED,
LOOSE
}
/**
* ReadingPreferences - User reading/display settings
*/
public class RSSuper.ReadingPreferences : Object {
public FontSize font_size { get; set; }
public LineHeight line_height { get; set; }
public bool show_table_of_contents { get; set; }
public bool show_reading_time { get; set; }
public bool show_author { get; set; }
public bool show_date { get; set; }
public ReadingPreferences() {
this.font_size = FontSize.MEDIUM;
this.line_height = LineHeight.NORMAL;
this.show_table_of_contents = true;
this.show_reading_time = true;
this.show_author = true;
this.show_date = true;
}
public ReadingPreferences.with_values(FontSize font_size = FontSize.MEDIUM,
LineHeight line_height = LineHeight.NORMAL,
bool show_table_of_contents = true,
bool show_reading_time = true,
bool show_author = true,
bool show_date = true) {
this.font_size = font_size;
this.line_height = line_height;
this.show_table_of_contents = show_table_of_contents;
this.show_reading_time = show_reading_time;
this.show_author = show_author;
this.show_date = show_date;
}
public string get_font_size_string() {
switch (this.font_size) {
case FontSize.SMALL: return "small";
case FontSize.MEDIUM: return "medium";
case FontSize.LARGE: return "large";
case FontSize.XLARGE: return "xlarge";
default: return "medium";
}
}
public static FontSize font_size_from_string(string str) {
switch (str) {
case "small": return FontSize.SMALL;
case "medium": return FontSize.MEDIUM;
case "large": return FontSize.LARGE;
case "xlarge": return FontSize.XLARGE;
default: return FontSize.MEDIUM;
}
}
public string get_line_height_string() {
switch (this.line_height) {
case LineHeight.NORMAL: return "normal";
case LineHeight.RELAXED: return "relaxed";
case LineHeight.LOOSE: return "loose";
default: return "normal";
}
}
public static LineHeight line_height_from_string(string str) {
switch (str) {
case "normal": return LineHeight.NORMAL;
case "relaxed": return LineHeight.RELAXED;
case "loose": return LineHeight.LOOSE;
default: return LineHeight.NORMAL;
}
}
public void reset_to_defaults() {
this.font_size = FontSize.MEDIUM;
this.line_height = LineHeight.NORMAL;
this.show_table_of_contents = true;
this.show_reading_time = true;
this.show_author = true;
this.show_date = true;
}
public string to_json_string() {
var sb = new StringBuilder();
sb.append("{\"fontSize\":\"");
sb.append(this.get_font_size_string());
sb.append("\",\"lineHeight\":\"");
sb.append(this.get_line_height_string());
sb.append("\",\"showTableOfContents\":");
sb.append(this.show_table_of_contents ? "true" : "false");
sb.append(",\"showReadingTime\":");
sb.append(this.show_reading_time ? "true" : "false");
sb.append(",\"showAuthor\":");
sb.append(this.show_author ? "true" : "false");
sb.append(",\"showDate\":");
sb.append(this.show_date ? "true" : "false");
sb.append("}");
return sb.str;
}
public static ReadingPreferences? from_json_string(string json_string) {
var parser = new Json.Parser();
try {
if (!parser.load_from_data(json_string)) return null;
} catch (Error e) {
warning("Failed to parse JSON: %s", e.message);
return null;
}
return from_json_node(parser.get_root());
}
public static ReadingPreferences? from_json_node(Json.Node node) {
if (node.get_node_type() != Json.NodeType.OBJECT) return null;
var obj = node.get_object();
var prefs = new ReadingPreferences();
if (obj.has_member("fontSize")) prefs.font_size = font_size_from_string(obj.get_string_member("fontSize"));
if (obj.has_member("lineHeight")) prefs.line_height = line_height_from_string(obj.get_string_member("lineHeight"));
if (obj.has_member("showTableOfContents")) prefs.show_table_of_contents = obj.get_boolean_member("showTableOfContents");
if (obj.has_member("showReadingTime")) prefs.show_reading_time = obj.get_boolean_member("showReadingTime");
if (obj.has_member("showAuthor")) prefs.show_author = obj.get_boolean_member("showAuthor");
if (obj.has_member("showDate")) prefs.show_date = obj.get_boolean_member("showDate");
return prefs;
}
public bool equals(ReadingPreferences? other) {
if (other == null) return false;
return this.font_size == other.font_size &&
this.line_height == other.line_height &&
this.show_table_of_contents == other.show_table_of_contents &&
this.show_reading_time == other.show_reading_time &&
this.show_author == other.show_author &&
this.show_date == other.show_date;
}
public void copy_from(ReadingPreferences other) {
this.font_size = other.font_size;
this.line_height = other.line_height;
this.show_table_of_contents = other.show_table_of_contents;
this.show_reading_time = other.show_reading_time;
this.show_author = other.show_author;
this.show_date = other.show_date;
}
}
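/*
 * Illustrative usage sketch (not part of the original file): the enum/string
 * helpers keep the serialized form stable even if enum values are reordered.
 */
private void example_reading_prefs() {
var prefs = new RSSuper.ReadingPreferences();
prefs.font_size = RSSuper.ReadingPreferences.font_size_from_string("large");
assert(prefs.get_font_size_string() == "large");
}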


@@ -1,435 +0,0 @@
/*
* SearchFilters.vala
*
* Represents search query parameters and filters.
* Following GNOME HIG naming conventions and Vala/GObject patterns.
*/
/**
* SearchContentType - Type of content to search for
*/
public enum RSSuper.SearchContentType {
ARTICLE,
AUDIO,
VIDEO
}
/**
* SearchSortOption - Sorting options for search results
*/
public enum RSSuper.SearchSortOption {
RELEVANCE,
DATE_DESC,
DATE_ASC,
TITLE_ASC,
TITLE_DESC,
FEED_ASC,
FEED_DESC
}
/**
* SearchFilters - Represents search filters and query parameters
*/
public struct RSSuper.SearchFilters {
public string? date_from { get; set; }
public string? date_to { get; set; }
public string[]? feed_ids { get; set; }
public string[]? authors { get; set; }
public SearchContentType? content_type { get; set; }
/**
* Default constructor
*/
public SearchFilters(string? date_from = null, string? date_to = null,
string[]? feed_ids = null, string[]? authors = null,
SearchContentType? content_type = null) {
this.date_from = date_from;
this.date_to = date_to;
this.feed_ids = feed_ids;
this.authors = authors;
this.content_type = content_type;
}
/**
* Get content type as string
*/
public string? get_content_type_string() {
if (this.content_type == null) {
return null;
}
switch (this.content_type) {
case SearchContentType.ARTICLE:
return "article";
case SearchContentType.AUDIO:
return "audio";
case SearchContentType.VIDEO:
return "video";
default:
return null;
}
}
/**
* Parse content type from string
*/
public static SearchContentType? content_type_from_string(string? str) {
if (str == null) {
return null;
}
switch (str) {
case "article":
return SearchContentType.ARTICLE;
case "audio":
return SearchContentType.AUDIO;
case "video":
return SearchContentType.VIDEO;
default:
return null;
}
}
/**
* Get sort option as string
*/
public static string sort_option_to_string(SearchSortOption option) {
switch (option) {
case SearchSortOption.RELEVANCE:
return "relevance";
case SearchSortOption.DATE_DESC:
return "date_desc";
case SearchSortOption.DATE_ASC:
return "date_asc";
case SearchSortOption.TITLE_ASC:
return "title_asc";
case SearchSortOption.TITLE_DESC:
return "title_desc";
case SearchSortOption.FEED_ASC:
return "feed_asc";
case SearchSortOption.FEED_DESC:
return "feed_desc";
default:
return "relevance";
}
}
/**
* Parse sort option from string
*/
public static SearchSortOption sort_option_from_string(string str) {
switch (str) {
case "relevance":
return SearchSortOption.RELEVANCE;
case "date_desc":
return SearchSortOption.DATE_DESC;
case "date_asc":
return SearchSortOption.DATE_ASC;
case "title_asc":
return SearchSortOption.TITLE_ASC;
case "title_desc":
return SearchSortOption.TITLE_DESC;
case "feed_asc":
return SearchSortOption.FEED_ASC;
case "feed_desc":
return SearchSortOption.FEED_DESC;
default:
return SearchSortOption.RELEVANCE;
}
}
/**
* Check if any filters are set
*/
public bool has_filters() {
return this.date_from != null ||
this.date_to != null ||
(this.feed_ids != null && this.feed_ids.length > 0) ||
(this.authors != null && this.authors.length > 0) ||
this.content_type != null;
}
/**
* Serialize to JSON string
*/
public string to_json_string() {
var sb = new StringBuilder();
sb.append("{");
var first = true;
if (this.date_from != null) {
sb.append("\"dateFrom\":\"");
sb.append(this.date_from);
sb.append("\"");
first = false;
}
if (this.date_to != null) {
if (!first) sb.append(",");
sb.append("\"dateTo\":\"");
sb.append(this.date_to);
sb.append("\"");
first = false;
}
if (this.feed_ids != null && this.feed_ids.length > 0) {
if (!first) sb.append(",");
sb.append("\"feedIds\":[");
for (var i = 0; i < this.feed_ids.length; i++) {
if (i > 0) sb.append(",");
sb.append("\"");
sb.append(this.feed_ids[i]);
sb.append("\"");
}
sb.append("]");
first = false;
}
if (this.authors != null && this.authors.length > 0) {
if (!first) sb.append(",");
sb.append("\"authors\":[");
for (var i = 0; i < this.authors.length; i++) {
if (i > 0) sb.append(",");
sb.append("\"");
sb.append(this.authors[i]);
sb.append("\"");
}
sb.append("]");
first = false;
}
if (this.content_type != null) {
if (!first) sb.append(",");
sb.append("\"contentType\":\"");
sb.append(this.get_content_type_string());
sb.append("\"");
}
sb.append("}");
return sb.str;
}
/**
* Deserialize from JSON string
*/
public static SearchFilters? from_json_string(string json_string) {
var parser = new Json.Parser();
try {
if (!parser.load_from_data(json_string)) {
return null;
}
} catch (Error e) {
warning("Failed to parse JSON: %s", e.message);
return null;
}
return from_json_node(parser.get_root());
}
/**
* Deserialize from Json.Node
*/
public static SearchFilters? from_json_node(Json.Node node) {
if (node.get_node_type() != Json.NodeType.OBJECT) {
return null;
}
var obj = node.get_object();
var filters = SearchFilters();
if (obj.has_member("dateFrom")) {
filters.date_from = obj.get_string_member("dateFrom");
}
if (obj.has_member("dateTo")) {
filters.date_to = obj.get_string_member("dateTo");
}
if (obj.has_member("feedIds")) {
var array = obj.get_array_member("feedIds");
var feed_ids = new string[array.get_length()];
for (var i = 0; i < array.get_length(); i++) {
feed_ids[i] = array.get_string_element(i);
}
filters.feed_ids = feed_ids;
}
if (obj.has_member("authors")) {
var array = obj.get_array_member("authors");
var authors = new string[array.get_length()];
for (var i = 0; i < array.get_length(); i++) {
authors[i] = array.get_string_element(i);
}
filters.authors = authors;
}
if (obj.has_member("contentType")) {
filters.content_type = content_type_from_string(obj.get_string_member("contentType"));
}
return filters;
}
/**
* Equality comparison
*/
public bool equals(SearchFilters other) {
return this.date_from == other.date_from &&
this.date_to == other.date_to &&
this.feeds_equal(other.feed_ids) &&
this.authors_equal(other.authors) &&
this.content_type == other.content_type;
}
/**
* Helper for feed_ids comparison
*/
private bool feeds_equal(string[]? other) {
if (this.feed_ids == null && other == null) return true;
if (this.feed_ids == null || other == null) return false;
if (this.feed_ids.length != other.length) {
return false;
}
for (var i = 0; i < this.feed_ids.length; i++) {
if (this.feed_ids[i] != other[i]) {
return false;
}
}
return true;
}
/**
* Helper for authors comparison
*/
private bool authors_equal(string[]? other) {
if (this.authors == null && other == null) return true;
if (this.authors == null || other == null) return false;
if (this.authors.length != other.length) {
return false;
}
for (var i = 0; i < this.authors.length; i++) {
if (this.authors[i] != other[i]) {
return false;
}
}
return true;
}
}
/**
* SearchQuery - Represents a complete search query
*/
public struct RSSuper.SearchQuery {
public string query { get; set; }
public int page { get; set; }
public int page_size { get; set; }
public string filters_json { get; set; }
public SearchSortOption sort { get; set; }
/**
* Default constructor
*/
public SearchQuery(string query, int page = 1, int page_size = 20,
string? filters_json = null, SearchSortOption sort = SearchSortOption.RELEVANCE) {
this.query = query;
this.page = page;
this.page_size = page_size;
this.filters_json = filters_json ?? "";
this.sort = sort;
}
/**
* Get filters as struct
*/
public SearchFilters? get_filters() {
if (this.filters_json == null || this.filters_json.length == 0) {
return null;
}
return SearchFilters.from_json_string(this.filters_json);
}
/**
* Set filters from struct
*/
public void set_filters(SearchFilters? filters) {
if (filters == null) {
this.filters_json = "";
} else {
this.filters_json = filters.to_json_string();
}
}
/**
* Serialize to JSON string
*/
public string to_json_string() {
var sb = new StringBuilder();
sb.append("{");
sb.append("\"query\":\"");
sb.append(this.query);
sb.append("\"");
sb.append(",\"page\":%d".printf(this.page));
sb.append(",\"pageSize\":%d".printf(this.page_size));
if (this.filters_json != null && this.filters_json.length > 0) {
sb.append(",\"filters\":");
sb.append(this.filters_json);
}
sb.append(",\"sort\":\"");
sb.append(SearchFilters.sort_option_to_string(this.sort));
sb.append("\"");
sb.append("}");
return sb.str;
}
/**
* Deserialize from JSON string
*/
public static SearchQuery? from_json_string(string json_string) {
var parser = new Json.Parser();
try {
if (!parser.load_from_data(json_string)) {
return null;
}
} catch (Error e) {
warning("Failed to parse JSON: %s", e.message);
return null;
}
return from_json_node(parser.get_root());
}
/**
* Deserialize from Json.Node
*/
public static SearchQuery? from_json_node(Json.Node node) {
if (node.get_node_type() != Json.NodeType.OBJECT) {
return null;
}
var obj = node.get_object();
if (!obj.has_member("query")) {
return null;
}
var query = SearchQuery(obj.get_string_member("query"));
if (obj.has_member("page")) {
query.page = (int)obj.get_int_member("page");
}
if (obj.has_member("pageSize")) {
query.page_size = (int)obj.get_int_member("pageSize");
}
if (obj.has_member("filters")) {
var generator = new Json.Generator();
generator.set_root(obj.get_member("filters"));
query.filters_json = generator.to_data(null);
}
if (obj.has_member("sort")) {
query.sort = SearchFilters.sort_option_from_string(obj.get_string_member("sort"));
}
return query;
}
/**
* Equality comparison
*/
public bool equals(SearchQuery other) {
return this.query == other.query &&
this.page == other.page &&
this.page_size == other.page_size &&
this.filters_json == other.filters_json &&
this.sort == other.sort;
}
}
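/*
 * Illustrative usage sketch (not part of the original file): build a filtered,
 * sorted query. Filters travel as a JSON string inside the struct.
 */
private void example_search_query() {
var filters = RSSuper.SearchFilters();
filters.content_type = RSSuper.SearchContentType.AUDIO;
var query = RSSuper.SearchQuery("gnome", 1, 50);
query.set_filters(filters);
query.sort = RSSuper.SearchSortOption.DATE_DESC;
message("query: %s", query.to_json_string());
}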


@@ -1,208 +0,0 @@
/*
* SearchResult.vala
*
* Represents a search result item from the feed database.
* Following GNOME HIG naming conventions and Vala/GObject patterns.
*/
/**
* SearchResultType - Type of search result
*/
public enum RSSuper.SearchResultType {
ARTICLE,
FEED
}
/**
* SearchResult - Represents a single search result
*/
public class RSSuper.SearchResult : Object {
public string id { get; set; }
public SearchResultType result_type { get; set; }
public string title { get; set; }
public string? snippet { get; set; }
public string? link { get; set; }
public string? feed_title { get; set; }
public string? published { get; set; }
public double score { get; set; }
/**
* Default constructor
*/
public SearchResult() {
this.id = "";
this.result_type = SearchResultType.ARTICLE;
this.title = "";
this.score = 0.0;
}
/**
* Constructor with initial values
*/
public SearchResult.with_values(string id, SearchResultType type, string title,
string? snippet = null, string? link = null,
string? feed_title = null, string? published = null,
double score = 0.0) {
this.id = id;
this.result_type = type;
this.title = title;
this.snippet = snippet;
this.link = link;
this.feed_title = feed_title;
this.published = published;
this.score = score;
}
/**
* Get type as string
*/
public string get_type_string() {
switch (this.result_type) {
case SearchResultType.ARTICLE:
return "article";
case SearchResultType.FEED:
return "feed";
default:
return "unknown";
}
}
/**
* Parse type from string
*/
public static SearchResultType type_from_string(string str) {
switch (str) {
case "article":
return SearchResultType.ARTICLE;
case "feed":
return SearchResultType.FEED;
default:
return SearchResultType.ARTICLE;
}
}
/**
* Serialize to JSON string
*/
public string to_json_string() {
var sb = new StringBuilder();
sb.append("{");
sb.append("\"id\":\"");
sb.append(this.id);
sb.append("\",\"type\":\"");
sb.append(this.get_type_string());
sb.append("\",\"title\":\"");
sb.append(this.title);
sb.append("\"");
if (this.snippet != null) {
sb.append(",\"snippet\":\"");
sb.append(this.snippet);
sb.append("\"");
}
if (this.link != null) {
sb.append(",\"link\":\"");
sb.append(this.link);
sb.append("\"");
}
if (this.feed_title != null) {
sb.append(",\"feedTitle\":\"");
sb.append(this.feed_title);
sb.append("\"");
}
if (this.published != null) {
sb.append(",\"published\":\"");
sb.append(this.published);
sb.append("\"");
}
if (this.score != 0.0) {
sb.append(",\"score\":%f".printf(this.score));
}
sb.append("}");
return sb.str;
}
/**
* Deserialize from JSON string
*/
public static SearchResult? from_json_string(string json_string) {
var parser = new Json.Parser();
try {
if (!parser.load_from_data(json_string)) {
return null;
}
} catch (Error e) {
warning("Failed to parse JSON: %s", e.message);
return null;
}
return from_json_node(parser.get_root());
}
/**
* Deserialize from Json.Node
*/
public static SearchResult? from_json_node(Json.Node node) {
if (node.get_node_type() != Json.NodeType.OBJECT) {
return null;
}
var obj = node.get_object();
if (!obj.has_member("id") || !obj.has_member("type") || !obj.has_member("title")) {
return null;
}
var result = new SearchResult();
result.id = obj.get_string_member("id");
result.result_type = SearchResult.type_from_string(obj.get_string_member("type"));
result.title = obj.get_string_member("title");
if (obj.has_member("snippet")) {
result.snippet = obj.get_string_member("snippet");
}
if (obj.has_member("link")) {
result.link = obj.get_string_member("link");
}
if (obj.has_member("feedTitle")) {
result.feed_title = obj.get_string_member("feedTitle");
}
if (obj.has_member("published")) {
result.published = obj.get_string_member("published");
}
if (obj.has_member("score")) {
result.score = obj.get_double_member("score");
}
return result;
}
/**
* Equality comparison
*/
public bool equals(SearchResult? other) {
if (other == null) {
return false;
}
return this.id == other.id &&
this.result_type == other.result_type &&
this.title == other.title &&
this.snippet == other.snippet &&
this.link == other.link &&
this.feed_title == other.feed_title &&
this.published == other.published &&
this.score == other.score;
}
/**
* Get a human-readable summary
*/
public string get_summary() {
if (this.feed_title != null) {
return "[%s] %s - %s".printf(this.get_type_string(), this.feed_title, this.title);
}
return "[%s] %s".printf(this.get_type_string(), this.title);
}
}


@@ -1,503 +0,0 @@
/*
* FeedFetcher.vala
*
* Feed fetching service using libsoup-3.0
* Supports HTTP auth, caching, timeouts, and retry with exponential backoff.
*/
using Soup;
using GLib;
/**
* FeedFetcher - Service for fetching RSS/Atom feeds
*/
public class RSSuper.FeedFetcher : Object {
private Session session;
private int timeout_seconds;
private int max_retries;
private int base_retry_delay_ms;
private int max_content_size;
/**
* Cache for fetched feeds
* Key: feed URL, Value: cached response data
*/
private HashTable<string, CacheEntry> cache;
/**
* Default timeout in seconds
*/
public const int DEFAULT_TIMEOUT = 15;
/**
* Default maximum retries
*/
public const int DEFAULT_MAX_RETRIES = 3;
/**
* Default base retry delay in milliseconds
*/
public const int DEFAULT_BASE_RETRY_DELAY_MS = 1000;
/**
* Maximum content size (10 MB)
*/
public const int DEFAULT_MAX_CONTENT_SIZE = 10 * 1024 * 1024;
/**
* Valid content types for feeds
*/
private static string[] VALID_CONTENT_TYPES = {
"application/rss+xml",
"application/atom+xml",
"text/xml",
"text/html",
"application/xml"
};
/**
* Signal emitted when a feed is fetched
*/
public signal void feed_fetched(string url, bool success, int? error_code = null);
/**
* Signal emitted when a retry is about to happen
*/
public signal void retrying(string url, int attempt, int delay_ms);
/**
* Create a new feed fetcher
*/
public FeedFetcher(int timeout_seconds = DEFAULT_TIMEOUT,
int max_retries = DEFAULT_MAX_RETRIES,
int base_retry_delay_ms = DEFAULT_BASE_RETRY_DELAY_MS,
int max_content_size = DEFAULT_MAX_CONTENT_SIZE) {
this.timeout_seconds = timeout_seconds;
this.max_retries = max_retries;
this.base_retry_delay_ms = base_retry_delay_ms;
this.max_content_size = max_content_size;
this.cache = new HashTable<string, CacheEntry>(str_hash, str_equal);
this.session = new Session();
this.configure_session();
}
/**
* Configure the Soup session
*/
private void configure_session() {
// In libsoup-3, Soup.Session:timeout is expressed in seconds
this.session.timeout = this.timeout_seconds;
// Identify the client; HTTP/2 is negotiated automatically by libsoup-3
this.session.user_agent = "RSSuper/1.0";
// Cookies are not needed for feed fetching, so no CookieJar session
// feature is added.
}
/**
* Fetch a feed from the given URL
*
* @param url The feed URL to fetch
* @param credentials Optional HTTP auth credentials
* @return FetchResult containing the feed content or error
*/
public FetchResult fetch(string url, HttpAuthCredentials? credentials = null) throws Error {
// Validate URL
if (!is_valid_url(url)) {
return FetchResult.err("Invalid URL", 400);
}
// Check cache first
var cached_entry = this.cache.lookup(url);
if (cached_entry != null && !cached_entry.is_expired()) {
debug("Cache hit for: %s", url);
return FetchResult.ok(cached_entry.content, 200,
cached_entry.content_type,
cached_entry.etag,
cached_entry.last_modified,
true);
}
// Perform fetch with retry logic
var request = new Message(Method.GET, url);
// Add cache validation headers if we have cached data
if (cached_entry != null) {
if (cached_entry.etag != null) {
request.request_headers.append("If-None-Match", cached_entry.etag);
}
if (cached_entry.last_modified != null) {
request.request_headers.append("If-Modified-Since", cached_entry.last_modified);
}
}
// Set up HTTP auth if credentials provided
if (credentials != null && credentials.has_credentials()) {
setup_http_auth(request, credentials);
}
int attempt = 0;
int delay_ms = this.base_retry_delay_ms;
while (attempt <= this.max_retries) {
try {
if (attempt > 0) {
this.retrying.emit(url, attempt, delay_ms);
GLib.usleep((uint)(delay_ms * 1000));
}
// Send the request and read the full response body
var body = this.session.send_and_read(request, null);
// Check status code
var status = (int) request.status_code;
if (status == 304) {
// 304 Not Modified - return cached content
debug("304 Not Modified for: %s", url);
if (cached_entry != null) {
return FetchResult.ok(cached_entry.content, 304,
cached_entry.content_type,
cached_entry.etag,
cached_entry.last_modified,
true);
}
return FetchResult.err("No cached content for 304 response", 304);
}
if (status != 200) {
return handle_http_error(status, request);
}
// Validate response body
if (body == null || body.get_size() == 0) {
return FetchResult.err("Empty response", status);
}
// Check content size
if (body.get_size() > this.max_content_size) {
return FetchResult.err("Content too large", status);
}
// Get content type from the response headers
var content_type = request.response_headers.get_one("Content-Type");
if (!is_valid_content_type(content_type)) {
warning("Unexpected content type: %s", content_type);
}
// Convert body to string: copy the data and NUL-terminate it so the
// cast is safe, then verify the result decodes as UTF-8
uint8[] data = body.get_data();
data += 0;
string content = (string) data;
if (!content.validate()) {
warning("Failed to decode response as UTF-8");
return FetchResult.err("Failed to decode response", status);
}
// Extract cache validation headers (get_one returns null when absent)
string? etag = request.response_headers.get_one("ETag");
string? last_modified = request.response_headers.get_one("Last-Modified");
// Cache the response
cache_response(url, content, content_type, etag, last_modified, request);
return FetchResult.ok(content, status,
content_type,
etag,
last_modified,
false);
} catch (Error e) {
warning("Fetch error (attempt %d): %s", attempt + 1, e.message);
// Check if retryable
if (!is_retryable_error(e)) {
return FetchResult.from_error(e);
}
attempt++;
if (attempt <= this.max_retries) {
// Exponential backoff
delay_ms = this.base_retry_delay_ms * (1 << attempt);
if (delay_ms > 30000) delay_ms = 30000; // Max 30 seconds
} else {
return FetchResult.from_error(e);
}
}
}
return FetchResult.err("Max retries exceeded", 0);
}
/**
* Fetch multiple feeds (currently sequential; results preserve input order)
*/
public FetchResult[] fetch_many(string[] urls, HttpAuthCredentials[]? credentials = null) throws Error {
var results = new FetchResult[urls.length];
for (int i = 0; i < urls.length; i++) {
var cred = (credentials != null && i < credentials.length) ? credentials[i] : null;
results[i] = this.fetch(urls[i], cred);
}
return results;
}
/**
* Set up HTTP authentication on a request
*/
private void setup_http_auth(Message request, HttpAuthCredentials credentials) {
if (credentials.username == null || credentials.username.length == 0) {
return;
}
// Create auth header
string auth_value;
if (credentials.password != null) {
auth_value = "%s:%s".printf(credentials.username, credentials.password);
} else {
auth_value = credentials.username;
}
var encoded = Base64.encode(auth_value.data);
request.request_headers.append("Authorization", "Basic %s".printf(encoded));
}
/**
* Handle HTTP error status codes
*/
private FetchResult handle_http_error(int status, Message request) {
switch (status) {
case 404:
return FetchResult.err("Feed not found", 404);
case 403:
return FetchResult.err("Access forbidden", 403);
case 401:
return FetchResult.err("Unauthorized", 401);
case 400:
return FetchResult.err("Bad request", 400);
case 500:
case 502:
case 503:
case 504:
return FetchResult.err("Server error", status);
default:
if (status >= 400) {
return FetchResult.err("Client error", status);
}
return FetchResult.err("Request failed", status);
}
}
/**
* Cache a response
*/
private void cache_response(string url, string content, string? content_type,
string? etag, string? last_modified, Message request) {
// Parse Cache-Control header (get_one returns null when the header is absent)
string? cache_control = request.response_headers.get_one("Cache-Control");
int max_age = 60; // Default 60 seconds
if (cache_control != null) {
max_age = parse_cache_control(cache_control);
}
var entry = new CacheEntry();
entry.content = content;
entry.content_type = content_type;
entry.etag = etag;
entry.last_modified = last_modified;
entry.fetched_at = DateTime.new_now_local();
entry.max_age_seconds = max_age;
this.cache.insert(url, entry);
// Limit cache size
if (this.cache.get_size() > 100) {
// Remove oldest entry
var oldest_key = find_oldest_cache_entry();
if (oldest_key != null) {
this.cache.remove(oldest_key);
}
}
}
/**
* Parse Cache-Control header for max-age
*/
private int parse_cache_control(string cache_control) {
var parts = cache_control.split(",");
foreach (var part in parts) {
var trimmed = part.strip();
if (trimmed.has_prefix("max-age=")) {
var value_str = trimmed.substring(8).strip();
int max_age;
if (int.try_parse(value_str, out max_age) && max_age > 0) {
return int.min(max_age, 3600); // Cap at 1 hour
}
}
}
}
return 60; // Default
}
/**
* Find the oldest cache entry key
*/
private string? find_oldest_cache_entry() {
string? oldest_key = null;
DateTime? oldest_time = null;
foreach (var key in this.cache.get_keys()) {
var entry = this.cache.lookup(key);
if (entry != null) {
if (oldest_time == null || entry.fetched_at.compare(oldest_time) < 0) {
oldest_time = entry.fetched_at;
oldest_key = key;
}
}
}
return oldest_key;
}
/**
* Check if a URL is valid
*/
private bool is_valid_url(string url) {
try {
var uri = GLib.Uri.parse(url, GLib.UriFlags.NONE);
var scheme = uri.get_scheme();
return scheme == "http" || scheme == "https";
} catch (Error e) {
return false;
}
}
/**
* Check if content type is valid for feeds
*/
private bool is_valid_content_type(string? content_type) {
if (content_type == null) {
return true; // Allow unknown content types
}
foreach (var valid_type in VALID_CONTENT_TYPES) {
if (content_type.contains(valid_type)) {
return true;
}
}
return true; // Be permissive
}
/**
* Check if an error is retryable
*/
private bool is_retryable_error(Error error) {
// Only transient network failures are worth retrying; errordomain
// codes are matched directly with the `is` operator
return error is NetworkError.TIMEOUT
|| error is NetworkError.CONNECTION_FAILED
|| error is NetworkError.SERVER_ERROR
|| error is NetworkError.EMPTY_RESPONSE;
}
/**
* Clear the cache
*/
public void clear_cache() {
this.cache.remove_all();
}
/**
* Get cache statistics
*/
public int get_cache_size() {
return this.cache.get_size();
}
/**
* Set timeout
*/
public void set_timeout(int seconds) {
this.timeout_seconds = seconds;
this.session.timeout = (uint) seconds;
}
/**
* Get timeout
*/
public int get_timeout() {
return this.timeout_seconds;
}
/**
* Set maximum retries
*/
public void set_max_retries(int retries) {
this.max_retries = retries;
}
/**
* Get maximum retries
*/
public int get_max_retries() {
return this.max_retries;
}
}
/**
* CacheEntry - Cached feed response
*/
private class CacheEntry : Object {
public string content { get; set; }
public string? content_type { get; set; }
public string? etag { get; set; }
public string? last_modified { get; set; }
public DateTime fetched_at { get; set; }
public int max_age_seconds { get; set; }
public CacheEntry() {
this.content = "";
this.max_age_seconds = 60;
}
/**
* Check if cache entry is expired
*/
public bool is_expired() {
var now = DateTime.new_now_local();
var elapsed = now.unix_timestamp() - this.fetched_at.unix_timestamp();
return elapsed > this.max_age_seconds;
}
}
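End-to-end, the fetcher above would be driven roughly as in the sketch below. This is hypothetical caller code, not part of the repository: it assumes the `retrying` signal declared on `FeedFetcher` and uses a placeholder feed URL.

```vala
// Hypothetical usage sketch (assumes the RSSuper namespace from above)
var fetcher = new RSSuper.FeedFetcher();
fetcher.retrying.connect((url, attempt, delay_ms) => {
    message("Retrying %s (attempt %d, backing off %d ms)", url, attempt, delay_ms);
});
try {
    var result = fetcher.fetch("https://example.org/feed.xml");
    if (result.successful) {
        message("Got %d bytes (cached: %s)",
                result.fetched_content.length,
                result.is_from_cache.to_string());
    } else {
        warning("Fetch failed (HTTP %d): %s", result.status_code, result.error);
    }
} catch (Error e) {
    warning("Unexpected error: %s", e.message);
}
```

Because `fetch()` caches successful responses and replays `If-None-Match`/`If-Modified-Since` on later calls, a second fetch of the same URL within the cached `max-age` is served from memory without touching the network.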

/*
* FetchResult.vala
*
* Result type for feed fetch operations.
*/
/**
* FetchResult - Result of a feed fetch operation
*/
public class RSSuper.FetchResult : Object {
private bool is_success;
private string? content;
private string? error_message;
private int http_status_code;
private string? content_type;
private string? etag;
private string? last_modified;
private bool from_cache;
/**
* Check if the fetch was successful
*/
public bool successful {
get { return this.is_success; }
}
/**
* Get the fetched content
*/
public string? fetched_content {
get { return this.content; }
}
/**
* Get the error message if fetch failed
*/
public string? error {
get { return this.error_message; }
}
/**
* Get the HTTP status code
*/
public int status_code {
get { return this.http_status_code; }
}
/**
* Get the content type
*/
public string? response_content_type {
get { return this.content_type; }
}
/**
* Get the ETag header value
*/
public string? response_etag {
get { return this.etag; }
}
/**
* Get the Last-Modified header value
*/
public string? response_last_modified {
get { return this.last_modified; }
}
/**
* Check if response was from cache
*/
public bool is_from_cache {
get { return this.from_cache; }
}
/**
* Create a successful fetch result
*/
public static FetchResult ok(string content, int status_code = 200,
string? content_type = null, string? etag = null,
string? last_modified = null, bool from_cache = false) {
var result = new FetchResult();
result.is_success = true;
result.content = content;
result.http_status_code = status_code;
result.content_type = content_type;
result.etag = etag;
result.last_modified = last_modified;
result.from_cache = from_cache;
return result;
}
/**
* Create a failed fetch result
*/
public static FetchResult err(string error_message, int status_code = 0) {
var result = new FetchResult();
result.is_success = false;
result.error_message = error_message;
result.http_status_code = status_code;
return result;
}
/**
* Create a failed fetch result from NetworkError
*/
public static FetchResult from_error(Error error) {
if (error is NetworkError) {
return FetchResult.err(error.message, get_status_code_from_error(error));
}
return FetchResult.err(error.message);
}
/**
* Helper to get HTTP status code from error
*/
private static int get_status_code_from_error(Error error) {
if (error is NetworkError.NOT_FOUND) {
return 404;
}
if (error is NetworkError.FORBIDDEN) {
return 403;
}
if (error is NetworkError.UNAUTHORIZED) {
return 401;
}
if (error is NetworkError.BAD_REQUEST) {
return 400;
}
if (error is NetworkError.SERVER_ERROR) {
return 500;
}
if (error is NetworkError.PROTOCOL_ERROR || error is NetworkError.SSL_ERROR) {
return 502;
}
return 0;
}
}

/*
* HttpAuthCredentials.vala
*
* HTTP authentication credentials for feed subscriptions.
*/
/**
* HttpAuthCredentials - HTTP authentication credentials
*/
public class RSSuper.HttpAuthCredentials : Object {
/**
* Username for HTTP authentication
*/
public string? username { get; set; }
/**
* Password for HTTP authentication
*/
public string? password { get; set; }
/**
* Default constructor
*/
public HttpAuthCredentials() {
this.username = null;
this.password = null;
}
/**
* Constructor with credentials
*/
public HttpAuthCredentials.with_credentials(string? username = null, string? password = null) {
this.username = username;
this.password = password;
}
/**
* Check if credentials are set
*/
public bool has_credentials() {
return this.username != null && this.username.length > 0;
}
/**
* Clear credentials
*/
public void clear() {
this.username = null;
this.password = null;
}
/**
* Equality comparison
*/
public bool equals(HttpAuthCredentials? other) {
if (other == null) {
return false;
}
return this.username == other.username &&
this.password == other.password;
}
}
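For a password-protected feed, the credentials class above plugs into `FeedFetcher.fetch()` as in this sketch (the username, password, and URL are placeholders, not values from the repository):

```vala
// Hypothetical usage sketch - all values are placeholders
var fetcher = new RSSuper.FeedFetcher();
var creds = new RSSuper.HttpAuthCredentials.with_credentials("alice", "s3cret");
try {
    var result = fetcher.fetch("https://example.org/private/feed.xml", creds);
    if (!result.successful && result.status_code == 401) {
        warning("Credentials were rejected by the server");
    }
} catch (Error e) {
    warning("Fetch error: %s", e.message);
}
// Wipe the credentials once they are no longer needed
creds.clear();
```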

/*
* NetworkError.vala
*
* Network error domain for feed fetcher service.
*/
namespace RSSuper {
/**
* NetworkError - Error domain for network operations
*/
public errordomain NetworkError {
TIMEOUT, /** Request timed out */
NOT_FOUND, /** Resource not found (404) */
FORBIDDEN, /** Access forbidden (403) */
UNAUTHORIZED, /** Unauthorized (401) */
BAD_REQUEST, /** Bad request (400) */
SERVER_ERROR, /** Server error (5xx) */
CLIENT_ERROR, /** Client error (4xx, generic) */
DNS_FAILED, /** DNS resolution failed */
CONNECTION_FAILED, /** Connection failed */
PROTOCOL_ERROR, /** Protocol error */
SSL_ERROR, /** SSL/TLS error */
CANCELLED, /** Request was cancelled */
EMPTY_RESPONSE, /** Empty response received */
INVALID_URL, /** Invalid URL */
CONTENT_TOO_LARGE, /** Content exceeds size limit */
INVALID_CONTENT_TYPE, /** Invalid content type */
}
}
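Code that raises these errors creates them with `throw new`, and callers match individual codes with the `is` operator; a minimal illustrative sketch (the function and its logic are invented for the example):

```vala
// Illustrative sketch of raising and matching NetworkError codes
string fetch_or_fail(bool reachable) throws RSSuper.NetworkError {
    if (!reachable) {
        throw new RSSuper.NetworkError.CONNECTION_FAILED("Host unreachable");
    }
    return "<rss/>";
}

void demo() {
    try {
        fetch_or_fail(false);
    } catch (RSSuper.NetworkError e) {
        if (e is RSSuper.NetworkError.CONNECTION_FAILED) {
            warning("Transient failure, worth retrying: %s", e.message);
        } else {
            warning("Permanent failure: %s", e.message);
        }
    }
}
```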
