| | Past Week | Past Month | Past Year | This Season | All Time |
|---|---|---|---|---|---|
| Forecasts | 20 | 20 | 262 | 20 | 844 |
| Comments | 25 | 25 | 293 | 25 | 464 |
| Questions Forecasted | 18 | 18 | 38 | 18 | 73 |
| Upvotes on Comments By This User | 13 | 14 | 204 | 18 | 857 |
Why do you think you're right?
Historical data from the past 20+ years (obtained from Wikipedia).
| Month | Avg # hurricanes |
|---|---|
| Jun | 0.25 |
| Jul | 0.9 |
Approximately 43% of hurricanes are classified as C3 or higher, so the expected number of C3+ hurricanes through July is 0.43 × (0.25 + 0.9) ≈ 0.49.
The forecast can be modeled using a Poisson distribution.
Judgmental adjustments can typically be made when information about ENSO is available, but current predictions are that we are shifting to a neutral phase of the oscillation, implying we should expect an average number of hurricanes. [1]
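A minimal sketch of the Poisson step, using the λ ≈ 0.49 expected C3+ count derived above (the choice to report P(0), P(1), and P(2+) is illustrative, not the question's actual bins):

```python
from math import exp, factorial

lam = 0.49  # expected number of C3+ hurricanes through July (0.43 * 1.15)

def poisson_pmf(k: int, lam: float) -> float:
    """P(X = k) for X ~ Poisson(lam)."""
    return lam**k * exp(-lam) / factorial(k)

p0 = poisson_pmf(0, lam)   # probability of no C3+ hurricane
p1 = poisson_pmf(1, lam)   # exactly one
p2_plus = 1 - p0 - p1      # two or more
```

With λ this small, most of the mass sits on zero events, which is why the forecast leans heavily toward the "no C3+ hurricane" outcome.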
Why might you be wrong?
The proportion of hurricanes reaching C3 might not be uniform throughout the hurricane season.
Do you think a slight downward adjustment on the chances of C3s would be worth it, given that the earlier months have cooler waters?
I checked the historical record and, yes: 43% of hurricanes are C3+ on average, but the figure is significantly lower at the beginning of the season.
I've updated my prediction/rationale accordingly.
Why do you think you're right?
You can download the data from here.
I've random-walked the most recent reading (Jan 29th = 13.47) through the end of March. At each step, I naively drew from the average and spread of that calendar day's change in ice extent over the past 25 years, and iterated 5,000 walks.
The result is that the most likely outcome is an average ice extent of 14.17 in March 2026, with only a 32% chance that it will set a new record minimum.
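A minimal sketch of that random-walk Monte Carlo; the per-day change statistics are placeholders (in practice they would be estimated from the 25-year daily record), and the 13.75 record threshold is purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)

n_walks, n_days = 5000, 61   # Jan 30 through Mar 31
start = 13.47                # most recent reading (Jan 29th)

# Placeholder per-calendar-day change stats; in practice, estimate each
# day's mean and spread of extent change from the past 25 years of data.
mean_change = np.full(n_days, 0.01)
std_change = np.full(n_days, 0.05)

steps = rng.normal(mean_change, std_change, size=(n_walks, n_days))
walks = start + np.cumsum(steps, axis=1)

march_avg = walks[:, -31:].mean(axis=1)   # average extent over March
p_record = (march_avg < 13.75).mean()     # illustrative record-minimum threshold
```

Each row of `walks` is one simulated trajectory; the fraction of rows whose March average dips below the record threshold is the Monte Carlo estimate of a new record.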
Danger zone! I've also vibe-coded a SARIMA model that output a more optimistic 14.4 forecast, with just a 15% chance of a new March minimum. It's becoming increasingly convenient to use AI for the heavy lifting on questions like these, so I'm experimenting a little bit.
I hypothesize that the path of the 2025 curve is an outlier, with significant ice loss much earlier than the spring equinox, which is why modeling with historical data suggests a repeat is unlikely.
Below is 2025 vs the 2010-2020 average
Why might you be wrong?
- The approach is a bit naive: the daily change in ice extent on consecutive days is likely strongly correlated. I believe these correlations run in clusters of a few days, though, so two months out the effect should be well masked by the average.
- I still need to double-check the SARIMA model's output for issues.
- The rate at which the ice surface melts is highly correlated with ice concentration, which is something I'm not currently accounting for. A smaller ice extent likely comes with a significantly lower average concentration, making quick-melting events such as those seen in 2025 more likely than usual.
Why do you think you're right?
As I understand it, the surge in funding that happened in January is typical: at these events, startups pitch and deals are announced. Expect higher-than-normal activity through March.
I don't think the crowd reacted vigorously enough to the $88 million seeded in this first month. The total so far is $253M, and the average monthly funding in 2025 was $13.7M, making it significantly less likely that the outcome will fall in the first bin.
Why might you be wrong?
I might be overreacting to the most recent data, though my new prediction ignores what happened so far in January 2026.
The funding uptick in Q1 is for all types of funding, so it's not specific to seed rounds. My assumption is that there's no reason to expect seed funding to follow a significantly different seasonal trend.
That said, I did not thoroughly double-check whether the claim is actually supported by data or is simply anecdotal. Of course, seed funding rounds are sparse and highly variable, so I agree that one should still assign more weight to historical data rather than picking up false signal from the outliers.
Why do you think you're right?
A recent paper from the European Council on Foreign Relations concludes that:
> It will take Russia some 5-10 years after the end of the war in Ukraine to refit and rearm for such an attack. To ensure Estonia’s security, the country and its allies need to continue developing their defence capability now and in the coming years—even if they have minimal assistance from the US. [1]
The paper examines two scenarios, an outright invasion or a hybrid in-and-out campaign, with the latter being more likely.
> Far more plausible than Russia’s direct invasion of Estonia is a hybrid attack: a rapid, deniable operation blending local proxies, cyber sabotage and limited Russian incursions. The goal would be political shock, not territorial gain: to create confusion, delay NATO’s decision-making and demonstrate Western impotence and disunity.
One critical claim in the paper is that European NATO countries can maintain a credible deterrence even with minimal US assistance, despite the perceived unreliability of the American partner.
Two interesting forecast-related items:
- A 2016 RAND wargame concluded that Russia could seize Tallinn within 60 hours of an invasion, which the Ukraine conflict has obviously demonstrated is not the case. Right at the beginning of the invasion of Ukraine, there were claims that Putin expected to capture Kyiv within 48 hours. It's plausible that Russian expectations were off not purely because of bad intelligence or a yes-man culture in the military, but because the Western perception of Russian capabilities was also exaggerated. This could suggest that Russia will be more careful than it has been with Ukraine.
- There's a linked article containing aggregated forecasts for: 1) When Russia will have the capability to attack; and 2) When Russia might attack. [2]
2027 and 2028 are highlighted as the two years in which Russia would be most likely to attack a NATO country. Yet, many of those forecasts date back to 2024 (and even earlier), so there might have been some expectation that the ongoing conflict would have already been halted by now.
[1] The bear in the Baltics: Reassessing the Russian threat in Estonia
Why might you be wrong?
- Russian internal perception of the Baltics is being rapidly shifted by propaganda, with parallels to what happened with Ukraine just before the invasion.
- Russia is isolated, and Putin is in an echo chamber. The risk for miscalculation is high.
> The tone of Russian propaganda regarding the Baltic states has become “extremely crude”, aggressive and hysterical. The escalation around the Kaliningrad issue has become a key marker in shaping public opinion. Russian media have begun comparing the region to the Siege of Leningrad, creating an image of a “besieged outpost” allegedly blockaded by NATO countries, primarily Lithuania and Poland. Russian media are promoting narratives about an “inevitable blockade” and “threats of occupation”. [3]
> In Vladimir Putin’s regime, a distorted perception of threats continues to prevail, driven by the growing isolation of the Kremlin elite and the lack of internal critical voices. Russia believes that it is already in direct conflict with the West and that the struggle is taking place in Ukraine, globally and ideologically. Such a perception and worldview increase the risk of miscalculation. [3]

[3] The best time for Russia to attack the Baltics: experts warn, naming key indicators
Hi Nicolò (@404_NOT_FOUND)
Thank you for your detailed analysis and thoughtful comment, as always. Regarding this part:
> A 2016 RAND wargame concluded that Russia could seize Tallinn within 60 hours of an invasion, which the Ukraine conflict has obviously demonstrated is not the case.
Are we sure we can compare these two situations and conclude from the initial phase of the full-scale invasion of Ukraine that Russia could not pull this off in the case of Tallinn? Estonian territory is about 8% of the territory the Ukrainian government controlled before the full-scale invasion, and Estonia has about 3% of the active-duty soldiers (soldiers plus national guard, in Ukraine's case) that Ukraine had in 2021. A smaller territory to protect, but an even smaller number of active-duty soldiers. Would they be able to mobilize reserves in time? Would a smaller army be easier to overwhelm for a Russian attacking force concentrating much bigger numbers at key points of the front?

At the same time, Estonia has allies that could come to its aid rather quickly - I would assume air forces especially, navies, maybe paratroopers? I am no expert. There are already about 1,200 military personnel from other NATO countries there. Also, Tallinn is on the Baltic Sea shore - does that change much? How well would it be protected from an attack coming from the sea?

Also, in Ukraine, Russia seems to have underestimated Ukrainian resolve and made some planning mistakes (I read that they relied too much on a rapid decapitation / coup de main - seize airfields near Kyiv, rush forces in, collapse political control - among other things). I am not saying that you aren't right; it is more a question of whether we can be so sure based on this one case. The Russian army has learned a lot during this conflict, but that may not translate into good invasion-plan choices at the political leadership level. By that I am not saying that an invasion is likely - not at all. What do you think?
Why do you think you're right?
Initial forecast based on the fact that the number of successful launches each month has been remarkably consistent through 2025, and SpaceX appears committed to continuing at a similar pace.
I've used a normal distribution, with the average and spread of 2025's monthly launch counts, to spread my forecast across the bins.
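A minimal sketch of that binning step; the monthly mean/spread and the bin edges below are assumptions for illustration, not the question's actual values:

```python
from math import erf, sqrt

def norm_cdf(x: float, mu: float, sigma: float) -> float:
    """CDF of a Normal(mu, sigma) distribution at x."""
    return 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2))))

mu, sigma = 13.0, 1.5  # assumed 2025 monthly launch average and spread
# Hypothetical question bins: <11, 11-13, 13-15, >15 launches
edges = [(0, 11), (11, 13), (13, 15), (15, float("inf"))]
probs = [norm_cdf(hi, mu, sigma) - norm_cdf(lo, mu, sigma) for lo, hi in edges]
```

The probability assigned to each bin is just the normal CDF evaluated at the bin's upper edge minus the CDF at its lower edge.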
Why might you be wrong?
- Haven't yet looked if there are any specific claims regarding SpaceX targeting any specific number of launches for the year
Why do you think you're right?
Here is a PDF with easily accessible cumulative monthly snowfalls. The data should be similar enough.
I'm starting with an empirical distribution built from the past 20 years of data, and I'm factoring in a high likelihood of about 1 inch of snow forecast for the first few days of February.
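A minimal sketch of that empirical-distribution step; the 20-season snowfall history and the bin edges below are made-up placeholders, not the actual record:

```python
import numpy as np

# Placeholder: February-total snowfall (inches) for the past 20 seasons;
# the real values would come from the linked PDF.
history = np.array([3.2, 0.0, 7.5, 12.1, 1.0, 4.4, 0.5, 9.8, 2.2, 6.0,
                    0.0, 15.3, 3.9, 1.7, 8.2, 0.3, 5.5, 11.0, 2.8, 4.1])

shifted = history + 1.0  # fold in the ~1 inch already forecast for early Feb

bin_edges = [0, 2, 5, 10, np.inf]  # hypothetical question bins
counts, _ = np.histogram(shifted, bins=bin_edges)
probs = counts / counts.sum()      # empirical probability per bin
```

Shifting every historical season by the already-forecast inch is a crude way to condition the empirical distribution on the near-term forecast.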
Why might you be wrong?
- Naive approach that is mostly agnostic of current conditions and expected weather patterns
- The empirical distribution would be non-ideal even with 30 years of data
- Snowfall trends are highly sensitive to climate change, and patterns might have shifted dramatically in the past few years
Why do you think you're right?
Following my initial analysis, I wanted to check whether, conditional on occurring before the end of July, a hurricane has a lower-than-baseline (43%) chance of being classified C3 or higher.
Copying the approach of @alter_hugo here, I checked the list of C3, C4, and C5 events in the past 50 years of the historical record, and reported below every time they happened before the end of July.
It only happened in 5 of the past 50 years, meaning that the average Jun/Jul hurricane is significantly weaker than the average hurricane in a season. This is likely due to lower ocean temperatures at the beginning of the season.
I'm still conservatively forecasting higher than the empirically measured baseline, as the available data is very sparse.
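One way to formalize "forecast higher than the sparse empirical baseline" is to shrink the 5-in-50 early-season count toward the full-season 43% figure; the prior strength below is an arbitrary assumption, not something from the data:

```python
# 5 of the past 50 seasons saw a C3+ storm before the end of July
early_hits, early_years = 5, 50
baseline = 0.43        # full-season C3+ proportion
prior_strength = 10    # pseudo-observations; an arbitrary assumption

# Beta-binomial-style shrinkage: blend the sparse early-season rate
# with the baseline, weighted by the prior strength.
p = (early_hits + baseline * prior_strength) / (early_years + prior_strength)
```

The result lands between the raw 10% empirical rate and the 43% baseline, which matches the stated intent of staying conservatively above the sparse data.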
Why might you be wrong?
... I mean, at the risk of being accused of cherry-picking, here is the 2013 situation:
😒