This report proposes a mathematically rigorous system for converting discrete calendar events into continuous frequency-domain representations, enabling "beat frequency" visualization of temporal patterns. The core insight is that calendar data can be treated analogously to neural spike trains—discrete point events that can be convolved with kernels to produce continuous intensity functions, then decomposed via spectral analysis to discover natural rhythms. The system combines kernel density estimation for signal construction, FFT/wavelet analysis for frequency discovery, queuing theory for capacity modeling, and chronobiology-inspired rhythm hierarchies for meaningful interpretation.
The current prototype ASSIGNS frequencies arbitrarily (e.g., weekly = a 7-day sine wave). This ignores actual event patterns and produces meaningless visualizations.
The goal is instead to DISCOVER frequencies from actual calendar data using signal processing techniques. The transform pipeline: Calendar Events -> Continuous Signal -> Frequency Spectrum -> Meaningful Insights
Chronobiology provides the conceptual framework for understanding human temporal patterns. Three rhythm classes exist:
| Type | Period | Calendar Analog |
|---|---|---|
| Ultradian | <24h | Meeting blocks, focus sessions, breaks |
| Circadian | ~24h | Daily work pattern, sleep/wake |
| Infradian | >24h | Weekly sprints, monthly reviews, quarterly planning |
These rhythms interact hierarchically—circadian clocks regulate ultradian oscillations, and infradian cycles modulate circadian patterns.
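These three classes can be expressed as a small classifier, in the same spirit as the `classifyRhythm` step used later in the analysis pipeline. A minimal sketch; the ±10% circadian tolerance band is an assumption:

```typescript
type RhythmClass = "ultradian" | "circadian" | "infradian";

// Map a detected period (in hours) to a chronobiology rhythm class.
// The tolerance around 24h is a placeholder choice, not a measured value.
function classifyRhythm(periodHours: number): RhythmClass {
  if (periodHours < 24 * 0.9) return "ultradian";   // < ~24h: meeting blocks, focus sessions
  if (periodHours <= 24 * 1.1) return "circadian";  // ~24h: daily work pattern
  return "infradian";                               // > ~24h: sprints, monthly reviews
}
```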
Each calendar event becomes a structured object:
```
{
  id: string,
  start: timestamp,
  end: timestamp,
  duration: minutes,
  type: EventType,
  energy_cost: 0-1,          // cognitive load
  flexibility: 0-1,          // can it move?
  recurrence: Pattern | null
}
```
Different events have different "shapes":
| Type | Shape |
|---|---|
| Meeting | Rectangular pulse |
| Deep Work | Trapezoidal (ramp up/down) |
| Deadline | Exponential buildup |
| Admin | Low-amplitude uniform |
| Travel | Fixed buffer (shadow) |
Events have temporal "shadows": implicit time costs such as preparation before and recovery after the event. Shadow duration scales with the event's energy cost and type.
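One possible shape for a shadow-computation helper, matching the `{start, end, weight}` records consumed later in the ingestion step. The 15/30-minute base durations and the 0.5x shadow weighting are assumptions, not specified by the model:

```typescript
interface Shadow { start: number; end: number; weight: number }

// Hypothetical shadow model: prep time before and recovery time after an event,
// both scaled linearly by energy_cost. Timestamps are in milliseconds.
function computeShadows(ev: { start: number; end: number; energy_cost: number }): Shadow[] {
  const MIN = 60 * 1000;
  const prep = 15 * MIN * ev.energy_cost;      // ramp-up before the event
  const recovery = 30 * MIN * ev.energy_cost;  // wind-down after the event
  return [
    { start: ev.start - prep, end: ev.start, weight: ev.energy_cost * 0.5 },
    { start: ev.end, end: ev.end + recovery, weight: ev.energy_cost * 0.5 },
  ];
}
```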
Calendar events are analogous to neural spike trains—discrete point events in time. We use kernel density estimation (KDE) to convert them into continuous intensity functions:

$$\lambda(t) = \sum_i w_i \, K(t - t_i)$$

Where:
- $t_i$ is the start time of event $i$
- $w_i$ is its weight (energy cost)
- $K$ is a kernel chosen by event type:
| Kernel | Use Case | Properties |
|---|---|---|
| Rectangular | Meetings, fixed blocks | Sharp boundaries, no spillover |
| Gaussian | Flexible events, deadlines | Smooth, captures uncertainty |
| Trapezoidal | Deep work sessions | Ramp-up and ramp-down |
| Exponential | Deadlines | Asymmetric, builds pressure |
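As a sketch, the kernel shapes in the table can be written as functions of the lag $t$ (hours from the event center), each normalized to peak at 1. The parameterization (half-widths, ramp length, time constant) is illustrative:

```typescript
// Kernel shapes for the load signal, as functions of lag t in hours.
// Parameter values are placeholders, not calibrated.
const kernels = {
  // Sharp boundaries, no spillover (meetings, fixed blocks)
  rectangular: (t: number, halfWidth: number) => (Math.abs(t) <= halfWidth ? 1 : 0),
  // Smooth, captures uncertainty (flexible events)
  gaussian: (t: number, sigma: number) => Math.exp(-(t * t) / (2 * sigma * sigma)),
  // Flat top with linear ramp-up/ramp-down (deep work sessions)
  trapezoidal: (t: number, halfWidth: number, ramp: number) => {
    const d = Math.abs(t);
    if (d <= halfWidth) return 1;
    if (d <= halfWidth + ramp) return 1 - (d - halfWidth) / ramp;
    return 0;
  },
  // Asymmetric buildup of pressure toward a deadline at t = 0
  exponential: (t: number, tau: number) => (t <= 0 ? Math.exp(t / tau) : 0),
};
```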
For events with duration (not point events), we convolve the kernel with a rectangular pulse spanning the event:

$$\lambda_i(t) = w_i \,\big(K * \Pi_{[t_i^{\text{start}},\; t_i^{\text{end}}]}\big)(t)$$

The amplitude represents "energy cost per unit time." A 3-hour meeting with energy_cost=0.8 contributes $0.8$ to the load intensity for its entire duration, plus shadow contributions before and after.
**Time density.** Y = hours committed per unit time. Simple, but ignores cognitive variation. Maximum theoretical value = 1.0 (100% scheduled).

**Cognitive load.** Y = weighted cognitive demand, accounting for event intensity and type. Allows "overload" values > 1.0 for high-demand periods.

**Slack.** Y = available capacity after scheduled load:

$$S(t) = \max\big(0,\; C_{\max}(t) - \lambda(t)\big)$$

where $C_{\max}(t)$ varies by time of day (circadian rhythm).
Apply the FFT to the continuous load signal to extract frequency components:

$$\hat{\lambda}(f) = \int \lambda(t)\, e^{-2\pi i f t}\, dt$$

Peaks in $|\hat{\lambda}(f)|$ reveal dominant periodicities. Expected discoveries: strong peaks at the 24-hour (circadian) and 7-day (weekly) periods, with weaker peaks at sprint, monthly, and quarterly cycles.
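To make this concrete, here is a naive O(n²) DFT—a stand-in for the library FFT the pipeline would actually use—applied to a synthetic load sampled once per day. The planted 7-day rhythm surfaces as the dominant non-DC bin:

```typescript
// Naive DFT magnitude spectrum (O(n^2)), illustration only.
function dftMagnitude(signal: number[]): number[] {
  const n = signal.length;
  const mags: number[] = [];
  for (let k = 0; k <= n / 2; k++) {
    let re = 0, im = 0;
    for (let t = 0; t < n; t++) {
      const angle = (-2 * Math.PI * k * t) / n;
      re += signal[t] * Math.cos(angle);
      im += signal[t] * Math.sin(angle);
    }
    mags.push(Math.sqrt(re * re + im * im));
  }
  return mags;
}

// Synthetic load: one sample per day over 70 days, with a 7-day cycle.
const days = 70;
const signal = Array.from({ length: days }, (_, t) =>
  0.5 + 0.4 * Math.sin((2 * Math.PI * t) / 7));
const mags = dftMagnitude(signal);

// Skip the DC bin (k = 0) and find the strongest remaining bin.
// With 10 full cycles in 70 samples, the peak lands at k = 10,
// i.e. period = 70 / 10 = 7 days.
let peakK = 1;
for (let k = 2; k < mags.length; k++) if (mags[k] > mags[peakK]) peakK = k;
console.log(`dominant period: ${days / peakK} days`); // → dominant period: 7 days
```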
For non-stationary patterns (rhythms that change over time), use wavelet analysis:

$$W(a, b) = \frac{1}{\sqrt{a}} \int \lambda(t)\, \psi^{*}\!\left(\frac{t - b}{a}\right) dt$$

This provides time-frequency localization. It can detect rhythms that appear, drift, or disappear—for example, a weekly pattern breaking down during a crunch period.
Autocorrelation reveals periodicities even in noisy data by measuring signal similarity at different time lags:

$$R(\tau) = \int \lambda(t)\, \lambda(t + \tau)\, dt$$

Peaks in $R(\tau)$ indicate cycle lengths. Useful for discovering patterns you didn't know existed (e.g., "your stress peaks every 11 days").
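A minimal sketch of this, assuming mean-centered daily samples and a biased estimator, recovers a planted 14-day cycle without knowing it in advance:

```typescript
// Biased, mean-centered autocorrelation estimate for lags 0..maxLag.
function autocorrelation(x: number[], maxLag: number): number[] {
  const n = x.length;
  const mean = x.reduce((a, b) => a + b, 0) / n;
  const centered = x.map((v) => v - mean);
  const r: number[] = [];
  for (let lag = 0; lag <= maxLag; lag++) {
    let s = 0;
    for (let t = 0; t + lag < n; t++) s += centered[t] * centered[t + lag];
    r.push(s / n);
  }
  return r;
}

// 140 days of synthetic load: a hidden 14-day stress cycle plus a small
// incommensurate oscillation standing in for noise.
const load = Array.from({ length: 140 }, (_, t) =>
  0.5 + 0.3 * Math.sin((2 * Math.PI * t) / 14) + 0.05 * Math.sin(1.7 * t));
const r = autocorrelation(load, 30);

// The first peak away from the trivial lag-0 maximum marks the cycle length.
let best = 7;
for (let lag = 8; lag <= 30; lag++) if (r[lag] > r[best]) best = lag;
console.log(`hidden cycle: ${best} days`); // → hidden cycle: 14 days
```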
When two periodic patterns with frequencies $f_1$ and $f_2$ combine, they produce a beat:

$$f_{\text{beat}} = |f_1 - f_2|, \qquad T_{\text{beat}} = \frac{1}{|f_1 - f_2|}$$

For calendar rhythms, this predicts "alignment moments": times when multiple cycles constructively interfere and their load peaks stack. Constructive interference = overload. Destructive interference = cancellation (rare in calendars, since events don't "cancel" each other).
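A worked example with illustrative numbers: a 7-day sprint cycle beating against a hypothetical 7.5-day personal rhythm realigns roughly every 105 days (15 sprints):

```typescript
// Beat period of two nearby rhythms, given their periods in days.
function beatPeriodDays(period1: number, period2: number): number {
  const fBeat = Math.abs(1 / period1 - 1 / period2); // f_beat = |f1 - f2|
  return 1 / fBeat;
}

// 7-day sprint vs. a hypothetical 7.5-day cycle: ≈ 105 days between alignments.
console.log(beatPeriodDays(7, 7.5));
```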
Pure superposition: $\lambda_{\text{total}}(t) = \sum_i \lambda_i(t)$
But humans have capacity limits. Use a saturation function, e.g.:

$$\lambda_{\text{eff}}(t) = C_{\max} \tanh\!\left(\frac{\lambda_{\text{total}}(t)}{C_{\max}}\right)$$

Or apply queuing theory: when load approaches capacity, "queue depth" (backlog) grows nonlinearly. In an M/M/1 model, mean response time degrades according to:

$$W = \frac{1}{\mu\,(1 - \rho)}$$

where $\rho = \lambda / \mu$ (utilization). At 90% utilization, wait time is ten times the unloaded service time—and it explodes as $\rho \to 1$.
**Little's Law** ($L = \lambda W$): average items in system = arrival rate x time in system. For calendars: Backlog = Task arrival rate x Time to complete.

**Utilization** ($\rho = \lambda / \mu$): load divided by capacity. $\rho > 1$ means unsustainable; the queue grows without bound.

**Response time** ($W \propto 1 / (1 - \rho)$): service time scaled by the utilization factor. Explains why everything feels harder when overloaded.
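A quick numerical illustration of the blow-up, assuming an M/M/1 model with a service rate of 1 task/hour (the rate is an assumption for the example):

```typescript
// M/M/1 mean time in system: W = 1 / (mu * (1 - rho)).
const mu = 1; // service rate, tasks per hour (assumed)
const responseTime = (rho: number) => 1 / (mu * (1 - rho));

for (const rho of [0.5, 0.8, 0.9, 0.95]) {
  console.log(`utilization ${rho} -> avg time in system ${responseTime(rho).toFixed(1)}h`);
}
// 0.5 -> 2.0h, 0.8 -> 5.0h, 0.9 -> 10.0h, 0.95 -> 20.0h
```

Doubling utilization from 0.5 to nearly 1.0 does not double the wait; it multiplies it tenfold.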
```
capacity_model = {
  // Daily capacity varies by circadian rhythm
  base_capacity: 8,  // hours of productive work
  // Circadian multiplier (0-1) by hour of day
  circadian_curve: [0.2, 0.1, 0.05, ...],  // midnight-6am low
  // Cognitive load degradation over time
  fatigue_factor: t => 1 - 0.02 * hours_worked(t),
  // Recovery rate during breaks
  recovery_rate: 0.15  // per 15-min break
}
```
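One way to combine these factors into an effective per-hour capacity is sketched below. The circadian curve values are placeholders (normalized to a peak of 1.0 per hour rather than the daily 8-hour budget), and the linear fatigue term mirrors the model above:

```typescript
// Placeholder circadian multipliers, one per hour of the day (0 = midnight).
const circadianCurve = [
  0.2, 0.1, 0.05, 0.05, 0.1, 0.3, // midnight-6am low
  0.6, 0.8, 1.0, 1.0, 0.9, 0.7,  // morning peak
  0.5, 0.6, 0.8, 0.9, 0.8, 0.6,  // afternoon rebound
  0.4, 0.3, 0.3, 0.25, 0.2, 0.2, // evening decline
];

// Effective capacity at a given hour of day after `hoursWorked` hours,
// combining the circadian multiplier with linear fatigue (2% per hour).
function effectiveCapacity(hourOfDay: number, hoursWorked: number): number {
  const base = 1.0; // capacity units per hour (assumed normalization)
  const fatigue = Math.max(0, 1 - 0.02 * hoursWorked);
  return base * circadianCurve[hourOfDay] * fatigue;
}
```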
- **Cycle alignment.** "All your major cycles align in 3 weeks." Algorithm: find the LCM of detected periods and predict the next alignment; score by combined amplitude.
- **Load trend.** "Your weekly load has increased 15% over the past month." Algorithm: linear regression on weekly load averages; alert on significant slopes.
- **Hidden rhythms.** "You have a hidden 14-day stress cycle." Algorithm: autocorrelation peaks plus spectral analysis; surface unexpected periodicities.
- **Slack forecast.** "18 hours of slack before overload this week." Algorithm: integrate $(C_{\max}(t) - \lambda(t))$ over the forecast window.
- **Rhythm stability.** "Weekly pattern regularity: 72% (high variance)." Algorithm: measure phase consistency of the weekly signal; high variance = irregular schedule.
- **Sustained overload.** "Utilization has exceeded 85% for 3 consecutive weeks." Algorithm: track rolling utilization; alert when sustained high load is detected.
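The cycle-alignment algorithm can be sketched by rounding detected periods to whole days and taking their LCM. Rounding is an assumption—measured periods are rarely integral, so a real implementation would need a tolerance-based near-alignment search:

```typescript
// LCM-based alignment: when do all detected cycles next re-align?
function gcd(a: number, b: number): number {
  return b === 0 ? a : gcd(b, a % b);
}
function lcm(a: number, b: number): number {
  return (a * b) / gcd(a, b);
}

// Periods in days, rounded to integers before combining.
function nextAlignmentDays(periods: number[]): number {
  return periods.map(Math.round).reduce(lcm, 1);
}

// Weekly, biweekly, and monthly cycles all align every 210 days.
console.log(nextAlignmentDays([7, 14, 30])); // → 210
```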
```typescript
// STEP 1: Ingest calendar events into a binned load signal
function ingestCalendar(events: CalendarEvent[], timeRange: number): LoadSignal {
  const resolution = 15 * 60 * 1000; // 15-minute bins
  const signal = new Float32Array(Math.ceil(timeRange / resolution));
  for (const event of events) {
    const weight = computeWeight(event);
    const kernel = selectKernel(event.type);
    const shadows = computeShadows(event);
    // Add the main event
    applyKernel(signal, event.start, event.end, weight, kernel);
    // Add its shadows
    for (const shadow of shadows) {
      applyKernel(signal, shadow.start, shadow.end, shadow.weight, kernel);
    }
  }
  return signal;
}
```
```typescript
// STEP 2: Frequency analysis
function analyzeFrequencies(signal: LoadSignal): FrequencySpectrum {
  // Apply FFT
  const fft = new FFT(signal.length);
  const spectrum = fft.forward(signal);
  // Find spectral peaks
  const peaks = findPeaks(spectrum, {
    minHeight: 0.1,
    minDistance: 1 // peaks at least 1 frequency bin apart
  });
  // Classify peaks against known rhythm bands
  const rhythms = peaks.map(peak => ({
    frequency: peak.frequency,
    amplitude: peak.amplitude,
    period: 1 / peak.frequency,
    type: classifyRhythm(1 / peak.frequency) // daily, weekly, etc.
  }));
  return { raw: spectrum, rhythms };
}
```
```typescript
// STEP 3: Compute interference between each pair of detected rhythms
function predictInterference(rhythms: Rhythm[]): BeatPrediction[] {
  const beats: BeatPrediction[] = [];
  for (let i = 0; i < rhythms.length; i++) {
    for (let j = i + 1; j < rhythms.length; j++) {
      const beatFreq = Math.abs(rhythms[i].frequency - rhythms[j].frequency);
      if (beatFreq === 0) continue; // identical frequencies never beat
      const beatPeriod = 1 / beatFreq;
      const amplitude = rhythms[i].amplitude * rhythms[j].amplitude;
      beats.push({
        sources: [rhythms[i], rhythms[j]],
        period: beatPeriod,
        nextPeak: findNextPeak(beatPeriod),
        severity: amplitude
      });
    }
  }
  return beats.sort((a, b) => b.severity - a.severity);
}
```
```typescript
// STEP 4: Capacity analysis
function analyzeCapacity(signal: LoadSignal, capacity: CapacityModel): CapacityReport {
  const utilization: number[] = [];
  const slack: number[] = [];
  for (let t = 0; t < signal.length; t++) {
    const cap = capacity.atTime(t);
    const load = signal[t];
    utilization.push(load / cap);
    slack.push(Math.max(0, cap - load));
  }
  return {
    avgUtilization: mean(utilization),
    peakUtilization: Math.max(...utilization),
    totalSlack: sum(slack),
    burnoutRisk: utilization.filter(u => u > 0.85).length / utilization.length
  };
}
```
| Edge Case | Problem | Solution |
|---|---|---|
| Recurring events with exceptions | Base pattern vs actual | Expand recurrence, apply exceptions, then analyze |
| All-day events | Duration ambiguous | Classify as "background load" with low weight, spread over working hours |
| Tentative vs confirmed | Uncertainty in schedule | Weight tentative at 0.5x; use probabilistic model |
| Travel time | Implicit time cost | Add shadow events based on location delta or explicit travel blocks |
| Energy vs time | 1hr meeting != 1hr deep work | Energy multipliers by event type (meetings = 1.2x, admin = 0.6x) |
| Context switching | Fragmentation cost | Add 10-min shadow between different-type events |
| Empty calendar | No signal to analyze | Return baseline rhythms from historical data or defaults |
| Sparse data | FFT needs density | Use wavelet analysis; interpolate gaps |
X-axis: Time (hours/days/weeks). Y-axis: Load intensity. Show: the load curve $\lambda(t)$, the capacity curve $C_{\max}(t)$, and shaded overload regions where load exceeds capacity.
X-axis: Period (log scale). Y-axis: Amplitude. Show: detected spectral peaks, each labeled with its rhythm class (ultradian, circadian, infradian).
The original "beat frequency" concept as a 3D surface: X = time, Y = frequency component, Z = amplitude. Interference patterns appear as ridge lines where multiple frequencies constructively combine. This is essentially a spectrogram or scalogram visualization.
RescueTime: Automatic tracking, classifies apps as productive/distracting. No frequency analysis.
Toggl: Manual tracking, project-based. Good for billing, not pattern discovery.
Microsoft Viva: Calendar analytics for burnout detection. Uses simple heuristics (overtime, meeting hours).
Chronobiology: Mathematical models of circadian rhythms using ODEs and coupled oscillators.
Spike Train Analysis: Kernel density estimation for neural firing rates—directly applicable methodology.
Queuing Theory: Well-established capacity planning frameworks from operations research.
FFT: O(n log n) frequency decomposition. Standard in DSP libraries.
Wavelets: Time-frequency localization. Good for non-stationary calendar patterns.
Autocorrelation: Hidden periodicity detection. Robust to noise.