• 0 Posts
  • 108 Comments
Joined 2 years ago
Cake day: July 2nd, 2023


  • I agree, the important part is definitely someone who teaches you how it works without going too in-depth (at least at the beginning) and who encourages you to experiment.

    But, do you think that your feeling of intimidation stemmed from the cameras being relatively new/expensive? Like, would you have been less intimidated if the camera had been 15-20 years old and accordingly cheap?

    Because today, you can get a 15-20 year old DSLR that’s still very useable and costs less than 50€, while in ~2010, there essentially were no 15-20 year old digital cameras.


  • I’d strongly argue against a point-and-shoot or phone camera.

    A ‘proper’ camera can be just as easy to use (just put it in program/auto mode) and isn’t much more expensive either, if you go for something older and used, which is all a 12 year old beginner needs anyways.

    The versatility also allows and encourages experimentation, and having an actual camera in your hand puts you in a very different mindset than just snapping away on your phone.

    Not to mention the quality difference even an older DSLR has compared to the tiny sensor of most cheaper point-and-shoots and even most modern phones.


  • Definitely her own one. And definitely a proper one. Allows her to take it home, if she likes it, and keep on shooting. And also allows her to grow with the camera.

    If you’re willing to look around a bit, you can find good deals on working cameras that of course won’t be the bee’s knees but are perfectly suitable for a beginner.

    For example, on my way to my vocational school, there’s a photo shop with a sold-as-is bin, where I got a working Sony a58 20MP DSLM with kit lens and battery for only 15€. Added a cheap charger from eBay and it’s a very decent camera for less than 25€ that’s perfectly beginner-friendly but isn’t limited to that.

    Of course, you won’t necessarily find a similar deal, but there are definitely very good deals out there, especially in the 8-20 MP range (although I wouldn’t go below 12 if you want it to at least compare to phones, resolution-wise). An older, cheap Canon, Sony, Nikon, etc. DSLR or similar.




  • Of course they know how to use a computer. They don’t know a thing about how a computer works but that doesn’t mean they can’t use it. Heck, my 8 y/o cousin can figure out how to open and play Minecraft on his tablet. No need for him to know about commands, programming languages and bits n bytes.

    Most people these days know how to use their phones, at the very least, and even there cog = settings. Most people don’t know how to use a CLI or how a spreadsheet program works, but they certainly can use a browser on a computer. Which is also a form of using a computer.

    And maybe they don’t explicitly know it’s a button. But they know if they tap or click on a cog it takes them to settings.

    And even figuring out how a mouse works takes only a few seconds, if all you’ve used before was a touchscreen (or even nothing at all). There’s a reason they took off in the first place.

    Although, if someone truly has never used a computer in any shape or form before (no smartphone, no tablet, not even a smart TV), you’d probably have a point that it’s not much more difficult for them to learn the common iconography than it would be to learn the CLI. But people rarely start with such a blank slate today.

    Don’t get me wrong, I don’t think it’s a good thing that people are less and less tech-literate these days. But my point is, tech illiteracy doesn’t mean they have never used any computer ever and don’t know what an app or settings icon is. I’d wager it’s more the other way around: people are so used to their devices working and their UIs looking pretty (and very samey) that iconography like cogs for settings is especially self-explanatory to them. It’s the same on their phone, tablet and even TV, after all.




  • Game dev salaries have increased roughly in line with inflation though, so development time still costs the studio about the same in real terms as 15 years ago, while AAA game prices are only now starting to pass the $70 mark; games didn’t generally go above $60 until around 2020.

    It’s a wonder they haven’t increased prices any sooner, as much as I’d like them to stay where they were.

    And again: if you don’t like the prices, vote with your wallet, buy used or on sale or don’t pay at all.


  • Was raised Roman Catholic but got disillusioned pretty quickly. I was fairly religious in elementary school, but by the time I was 14, I was agnostic/atheist.

    Partially because my parents aren’t religious (my mum is from the GDR, so she didn’t grow up with religion, and my dad left the church before I was even born), and even my grandma, who was the religious one (albeit never very strongly compared to American Catholics; more a “goes to church on religious holidays” type of person), drifted away from the church quite a bit after all the child-rapist priest shit that was uncovered at the time.

    By now (mid 20s) I’d probably consider myself agnostic. Can’t prove there is no higher power, but also, if there is, we wouldn’t know which religion – if any – is right anyway. It’s probably not Christianity though.



  • Yea, I don’t generally disagree. Especially if you’re someone who plays games for hundreds of hours instead of dozens.

    But $100 is still a lot of money for a lot of people. I’d have to save up for months for that (I’m a trainee and have less than 1000€ per month for rent, food, internet, gas, etc.), so I’d rather wait until I can get games cheaper.


  • Eh, there’s some truth to both. Game development is expensive and pricing hasn’t kept up with inflation ($60 in 2010 is almost $90 in today’s money). But also, games are ridiculously expensive at full price, especially in today’s economy and especially if they’re as badly received as Skull and Bones, while Nintendo games are at the very least usually pretty decent.

    I’d recommend voting with your wallet and only buying games on sale or used. Just wait a little. (Or pirate them, if you can live with not supporting the developers at all.)







  • I’m aware stuff like that exists. I was being sarcastic. Just wanted to highlight that searching through recent commands would be much easier in a GUI as well. Should’ve used a “/s”, my bad.

    Also, I too wouldn’t hold up Windows as a staple of good UI design. Its jumble of four different design languages nested into each other in the most unintuitive ways, with some actions reachable in multiple different places and others buried deep in submenus, is not how I’d want a GUI to be. It’s also not user-friendly and very much one reason I’ve banished Windows from my household.

    But, people are used to it. At least enough to find basic settings. And I think that’s the best argument against pushing the terminal. People are familiar with graphical interfaces. They understand commonly used symbols (like cog = settings and similar stuff) because all mainstream operating systems, desktop or mobile, have used something similar for close to three decades. They are familiar with menus and submenus. Of course they don’t know where everything is when they use an unfamiliar program/OS, but they are familiar with the concepts. They are not familiar with CLIs. You are, because you have been using them for a while. So am I, and so are quite a few other people who use them regularly. The average Joe computer user isn’t.

    Even stuff like tab to autocomplete and arrow-up for history are foreign concepts for someone who has never used a terminal before. Sure, it’s not hard to learn, but they’d need to learn it. Not to mention that a lot of commands are abstract enough that they’re hard to memorise and thus to understand. It’s like a language you do have to learn. Not a difficult language if you don’t need to do complicated things, but it’s a hurdle nonetheless.

    Which is also why I don’t like the “literally just telling the computer what to do” argument I’ve heard a few times now. I mean, it’s not entirely wrong, but it’s telling the computer what to do in its language, not in yours. You don’t type “Hello computer, please update my system and programs” or even just “update”, you type “sudo pacman -Syu”. Any non-tech person will be utterly confused at what even a “sudo” is or what pacman has to do with Linux. And yes, pacman is an especially obscure example and Arch is definitely not the distro for newbies, regardless of their stance on terminals, but my point still stands even with apt, dnf and co. To tell a computer what to do via the CLI, you’ll have to either learn its language or copy it from someone who does.
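
    For illustration, this is roughly what that same “update everything” request looks like on a few common distros (these are the standard commands for each package manager; exact flags and habits vary):

        sudo pacman -Syu                      # Arch and derivatives
        sudo apt update && sudo apt upgrade   # Debian/Ubuntu
        sudo dnf upgrade                      # Fedora

    None of it reads like plain English until you’ve learned what each part means.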

    A GUI however tries to translate that language for you already and gives you context clues based on common culture (floppy = save, cog = settings, folder = directories, etc.). It’s a language even small children and illiterate people can understand, to some extent at least.

    But yes, I do agree, the most popular distros are fairly streamlined and mostly usable without the CLI. And that’s good. It makes it possible for Linux to slowly gain market share even among non-technical people, and I can, in good faith, recommend/install it for friends and family, knowing they’ll manage unless there’s a problem. And I do think Linux is getting better in this regard every day, and while not yet on par with the current mainstream OSes in terms of ease of use, it’s not far behind anymore. But it is still behind.

    I’m just tired of the elitist enthusiast who doesn’t want Linux to become easier to use for the everyman because it’d be less special. That attitude does not further FOSS and does not help anyone, because that’s not how you reduce Microsoft’s, Google’s or Apple’s influence on the tech scene.

