I’ve played the crap out of both; they’re really good.
And that’s entirely valid; like I say, Stardew gameplay is immensely satisfying in and of itself.
I just feel like all these other mechanisms in arpgs are thrown on top to try and disguise the nature of the thing, and it’s that disparity that leaves people jaded.
Stardew doesn’t have an endless progression of increasingly fell and eldritch vegetables that need you to constantly grind for upgrades just to tend to them. You water things in one click all the way through, and that feels good; you don’t need to chase a sawtooth pseudo-progression in order to be satisfied.
Stardew doesn’t make you do NP-complete multi-knapsack-problems in order to even have a viable character, or drown you in overly complex interactions so you can’t usefully plan in your head; there’s complexity there, but of the kind that opens up more options.
It manages to be fun without those things, but ARPGs seem to overwhelmingly rely on them in order to be engaging at all.
Why is that?
Why does gory-stardew need all those external obfuscations, when the normal kind doesn’t?
How could you make a gory-stardew that’s comfortable in its own skin?
I have absolutely no wish to dumb them down.
As I said, if you just took away all those mechanics, you’d be left with a boring empty game.
What I said was that it would be nice if you could make the combat feel more like hunting than gathering, so you wouldn’t have to make up for it with a) number-go-up and b) NP-hard busywork - you could then go for much more enriching forms of complexity.
For instance, making mobs fight a lot more tactically as their level increases instead of just stacking on the HP and damage - and instead of your perks just driving stat inflation, they could unlock new tactical options on your part, giving you a series of new stops to pull out as the battles got more fraught.
I dunno if it even needs to be difficult; even a bit tactical would change the nature of the thing. As it is the mobs in these things tend to be mindless converging waves; what if they set up set pieces, ran for help, dived for cover, used supporting fire etc etc?
Also perhaps overambitious, but what if the difference between low and high level enemies wasn’t their HP or damage, but how tricky and organised they were? What if leveling up didn’t make number get big, but instead gave you more options in a fight?
Ah, perhaps a slight miscommunication.
Though I do enjoy traditional roguelikes, I’m not looking at the stakes or the intensity, but rather the kind of itch that’s being scratched in diabolikes, and it feels a lot more completionist/procedural in nature than it does adversarial.
Both are good, but dressing one up as the other can lead to an underlying sense of disappointment.
Resources and influence will always drunkard’s-walk into the hands of the unscrupulous and manipulative, pretty much by definition.
They’re going to be drawn to it, they’ll fight dirtier for it, and they’ll use the power it gives them to prevent anyone else from taking it away.
Big Tech is a huge source of both, so it would be amazing if the people on top of the heap weren’t massive piles of shit.
I do know about window managers, thanks.
And that’s part of the problem: they all have their own slightly different infrastructure that relies on slightly intricate and not-quite-standard plumbing.
Dialogs not opening, or those weird invisible 30-second timeouts when opening an application because dbus isn’t happy because one of the xorg init scripts messed up some XDG path or set the wrong GTK_* option, or XAUTHORITY is pointing somewhere weird.
Whichever user is logged in locally should be allowed to talk to the device they plugged in via usb? Well that’s just an unreasonable thing to expect to happen by default, let me spend 20 minutes cooking up a udev script to chown it on creation.
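For what it’s worth, those twenty minutes usually end in a rule that looks something like this - the vendor/product IDs and the group here are placeholders, not a real device, so substitute whatever `lsusb` reports:

```
# /etc/udev/rules.d/99-local-usb.rules
# Placeholder IDs -- replace 1234/abcd with your device's (from lsusb).
SUBSYSTEM=="usb", ATTR{idVendor}=="1234", ATTR{idProduct}=="abcd", MODE="0660", GROUP="plugdev"
```

Then `udevadm control --reload-rules && udevadm trigger` to apply it without replugging. Which is exactly the kind of plumbing nobody should have to hand-write for the default case.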
Users managing to set their default terminal to some random script they were working on (seriously, how?). Or they initialised their xfce4 profile with the blank-toolbar option and now can’t work out how to launch anything.
Notification popups? Sure, the toolbar will let you add one, but nothing communicates with it by default lol.
also jesus christ kde.
And I’m talking about the built-in functionality of the desktop environment wrt package management, not separate applications.
Sure, it’s nice to be able to apt-get upgrade and just get everything all at once - when everything is happy with everything else.
But when you get conflicting dependencies and have to take time out to track down what libpyzongo0-util is used for, or what will break later on if you just purge it - because people use cutesy package names that are worse than Ruby libraries at communicating what they’re actually for - or whether we need this thing for the core platform or it’s some random crap that was installed ad-hoc and used precisely once, it gets old.
Like I say you need this amount of flexibility and complexity for development and deployment and network services and all the rest. Anyone using Windows for much more than file-print-office-browser-gaming has more masochism in them than I can comprehend.
But for that same very minimal set of core use-cases, you don’t need (or, I’d argue, want) flexibility or complexity, you want it to be simple and robust with JOWTDI. And for everything else, you ssh into your linux box and do it there. I was amazed to discover that Windows Terminal is actually really nice; combine that with an X server and maybe a VNC client, and you’ve got the best of both worlds.
And yes, Windows has all kinds of annoying shit of its own - but that mostly pops up when you want to do interesting things on it, not when you just want to look at cat videos on the internet.
I’m a sysadmin. We’re a Linux shop, I spend my life deep in the guts of Linux boxes, both server and desktop.
And for my daily-driver both at work and at home, I use windows.
The UI and overall UX are just better. The annoying bullshit I make a living knowing my way around, I don’t have to think about.
For actual development or backend services, of course you want a Linux box. Proper logging, proper tools, build shit, pipe it together, automate stuff and get down and technical when it breaks. Doing that on windows is absolutely hell.
But on windows, the volume control just works, I never have to delete lockfiles to get my browser to open, my desktop login doesn’t terminate if something in .profile returned nonzero, I can play every video game out there without having to fuck around, I can use native versions of real apps, I don’t have package-management dependency hell, all the pieces were designed to work with each other, and the baseline cognitive load needed to just use my computer is zero, which frees up my brain to focus on my actual work, or for playing games and fucking around on the internets.
Let me guess, they laughed at trump?
What do you think an algorithm is?
He says, on social media
How complex does a neural net have to be before you can call any of its outputs ‘pain’?
Start with a lightswitch with ‘pain’ written on a post-it note stuck to the on position, end with a toddler. Where’s the line?
How about a law stating that the terms of all contracts with corporations be listed in a public registry?
You want the state to enforce the terms of your agreement, you put that agreement out in the open. No secret laws, and no secret laws-by-proxy.
If you want something decent, check out ‘Lauren Ipsum’, by Carlos Bueno.
The phrasing of the message implies that you’re still subject to e.g. your employer logging your network access, and third-party sites logging your IP - both of which would be physically unavoidable and not within the browser’s ability to control.
It very carefully avoids saying ‘we’re still selling your identity and browsing habits to ad companies and dataminers even though we could totally prevent that lol’.
Universities aren’t there to teach marketable skills, and they never have been.
In fact they get quite snotty about the distinction; they’re not some trade school, ugh.
They go and market themselves as employment-enablers, because that drives enrolments which drives funding, but a large percentage of academics see undergrads as a vexing and demeaning distraction from their real work of writing grant proposals. Which to be fair is what their whole career (and the existence of their employer) depends on and is judged by, so…
The other thing is that there are two skillsets involved here: learning to use a specific set of tools and techniques to produce a desired outcome (the trade part), and learning to wrestle with large, unwieldy and interconnected tasks in general, while picking up the required specific knowledge along the way (the academic part).
Teaching just the trade part gets you people who are competent in narrowly-defined roles for now, but it doesn’t necessarily get you adaptable, resilient, bigger-picture people with common sense and a strategic outlook. Teaching just the academic part gets you people who aren’t necessarily productive right now, but have a lot of potential wherever you put them.
Employers would like to hire people who are both. They’re also lazy and cheap, and will use anything they can get their hands on as a resume-filter, because they aren’t willing to put time and money into usefully evaluating a candidate’s potential as an employee. If they can farm that out to the universities to do (and the students to pay for), they’re happy to let a degree stand in as a not-chaff marker they can require of all their candidates. It’s like bad video game designers using bullet sponges to ‘increase the difficulty’.
Teaching CS is important and useful, but the benefits only really pay off longterm - apart from the bullet-sponge factor.
Teaching programming is important and useful, but the benefits can be short-term and dead-end.
If you only pick one… depends on whether you can afford to eat while those nebulous long-term benefits slowly kick in.
Universities should communicate these things better, and employers should be incentivised to stop using junk degree-requirements to offset their laziness and incompetence. Make it so for every position they require a degree for, they’re taxed the tuition fees for that degree every year.
He says, on the internet.
This is pretty much the textbook definition of moral panic.
It was zero degrees today, and it’ll be twice as cold tomorrow.
Obviously ideas of fun vary; people are allowed to enjoy things I don’t like :)
Also I’m not rampantly disagreeing with you here, just picking at the edges for discussion because it still doesn’t sit quite right in my head.
It’s just… sometimes I feel like the implementation of complexity in these things is just kind of lazy, comparable to adding difficulty by making enemies bullet-sponges. It’s certainly more work to defeat them, but is that work rewarding?
Consider the annoyance that triggered this whole post.
In Grim Dawn, midway through Elite, I had some gloves with fairly miserable specs for my level, but they were providing most of my vitality res. Can I change them out?
Well there’s some with better overall specs but no vitality but they do have a lot of fire res, so I could swap those in, then the ring I was getting lots of fire res from could go, and there’s one with some vitality but unfortunately no poison, so let’s see, I do have a helmet that …
spongebob_three_hours_later.jpg
… but now my vitality is three points too low to equip the pants, oh fuck off. How is this fun?
Finding a reasonable solution doesn’t make you feel clever, and making an awkward compromise doesn’t feel like a justifiable sacrifice, it feels like you finally got too exhausted to search through more combinations and gave up. You can’t really look forward to getting better gear to fill a gap, because you’re going to have to go round and round in circles again trying to build a whole new set around the deficiencies that come with it.
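That gear-juggling really is a small knapsack search. A toy sketch (slot names, items and resistance numbers all invented, nothing like real Grim Dawn data) shows how fast the combinations pile up even before you add the other eight slots:

```python
from itertools import product

# Invented gear pools: each item grants (vitality, fire, poison) resistance.
slots = {
    "gloves": [(30, 0, 5), (0, 25, 0), (10, 10, 10)],
    "ring":   [(0, 30, 0), (15, 0, 0), (5, 5, 20)],
    "helmet": [(0, 0, 25), (20, 5, 0), (5, 15, 5)],
    "pants":  [(10, 0, 0), (0, 10, 15), (25, 0, 5)],
}

caps = (30, 30, 25)  # resistances you need just to be 'viable'

def viable(combo):
    """Sum each resistance across the equipped items and check every cap."""
    totals = [sum(item[i] for item in combo) for i in range(3)]
    return all(t >= cap for t, cap in zip(totals, caps))

# Brute force: try every way of picking one item per slot.
combos = list(product(*slots.values()))
good = [c for c in combos if viable(c)]
print(f"{len(combos)} combinations, {len(good)} meet the caps")
```

Four slots with three items each is already 81 combinations to sift; a real character has a dozen-plus slots and a stash full of candidates, so the search space goes exponential - which is exactly why it stops feeling like play.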
It’s like debating against a Gish Gallop - taxing to keep up with but without any real sense of achievement.
And honestly it doesn’t feel like that’s really intended to be the real gameplay. If the genre is really a build-planning-combinatorics game with a bit of monster-bashing on the side, where’s the quality-of-life UX to go with it? Where are the management tools to bring the actual problem-domain to the fore? Where’s the sort-rank-and-filter, where are the multi-axis comparisons? Where are the saved equipment sets? Why is the whole game environment and all the interface based around the monster-bashing, if that’s just the testing phase? And if navigating hostile UX is part of the challenge, then again I say that challenge is bad game design.
And all the layered mechanics across the genre feel like that: bolted on and just kind of half-assed, keeping the problem-domain too hard to work on because of externalities rather than the innate qualities of the problem itself. I know, let’s make the fonts really squirly and flickery so you can only peer at the stats for five minutes before you get a headache, that’ll give people a challenging time constraint to work with.
Did you ever play Mass Effect: Andromeda, with the shitty sudoku minigame bolted on to the area unlocks? You know how that just… didn’t make the game fun?
That.
Also it seems to me that if the prep-work were really the main drawcard, we’d be seeing a lot more Football-Manager-like tweak-and-simulate loops. Build your character, let it bot through the map (or just do an action montage), then come back with a bunch of loot and XP to play with before sending it out again.
I think an ideal game would hit all three kinds of satisfaction: tactics/graaagh, exploration/harvesting and mastery/optimisation. And ideally, each of those three targets would be free of external complications and left to focus on their own innate challenge and rewards.
I know that’s easy to say and hard to do… I’m just surprised that we haven’t got significantly better at it in the last couple of decades.