  • Not entirely correct; I reckon De Grauwe is using a fictional example to make his point.

    For the exact measures that are currently on the table, I can refer you to this VRT page, which says:

    • an exemption for the first 10,000 euros of profit
    • a 10% tax on all profit above 10,000 euros
    • And then the exemptions for the “strongest shoulders”: people who own more than 20 percent of a company’s shares get an exemption of 1 million euros. Capital gains between 1 and 2.5 million euros are taxed at 1.25 percent, those between 2.5 and 5 million at 2.5 percent, and those between 5 and 10 million at 5 percent. Only on gains above 10 million euros do you pay the ‘full’ capital-gains tax of 10 percent.
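    Those tiers behave like ordinary progressive brackets: each rate applies only to the slice of the gain that falls inside its band, not to the whole amount. A minimal sketch, using the figures above (the function and bracket table are my own illustration, not anything from the VRT page):

    ```python
    # Tiered capital-gains rates for >20% shareholders, as summarized above.
    # Each tuple is (upper bound of the band in euros, rate on that band).
    BRACKETS = [
        (1_000_000, 0.0),      # first 1 million: exempt
        (2_500_000, 0.0125),   # 1M - 2.5M: 1.25%
        (5_000_000, 0.025),    # 2.5M - 5M: 2.5%
        (10_000_000, 0.05),    # 5M - 10M: 5%
        (float("inf"), 0.10),  # above 10M: the 'full' 10% rate
    ]

    def tiered_tax(gain: float) -> float:
        """Tax due on a capital gain, applying each rate only to its slice."""
        tax, lower = 0.0, 0.0
        for upper, rate in BRACKETS:
            if gain > lower:
                tax += (min(gain, upper) - lower) * rate
            lower = upper
        return tax

    print(tiered_tax(3_000_000))
    ```

    For a gain of 3 million euros this comes to 1,500,000 × 1.25% + 500,000 × 2.5% = 18,750 + 12,500 = 31,250 euros, an effective rate of just over 1 percent.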

  • There might indeed be ways around the filter, e.g. a stable, non-exploitative society, but such a society would never reach space. The filter might indeed not exist at all, and the universe could still be young, but I’m not very convinced. If the universe were young, and if it keeps expanding as it currently does, later civilizations would have fewer opportunities, since fewer stars would remain visible to explore. The more time passes, the smaller the chances get.

    Let’s say humans do cling on. I believe they will face challenges too steep to make long-term survival probable. Not only the heavy pollution and the unlivable climate, but also the depletion of basic minerals will probably prove too great an obstacle. Such a band of humans would have to hold on to all current technology, and have sufficient power sources, to do deep underground mining, since all the easy-to-reach minerals will already be gone. Without that technology or those minerals, I’m not sure how we’d grow food or clean the air we breathe.

  • I believe we’re seeing a universal law in action: any technologically advanced civilization ends up destroying itself. Whether it’s warming from extracted fossil fuels, or a nuclear war, or AI, …, there is, and must be, a seed of destruction in every advanced civilization. I purposefully say ‘must be’ because of the Fermi paradox, which should tell us all that any sci-fi future is forever beyond our grasp.