
Playing around with things, going too far, and dialing it back is all part of the creative process. The happy place is different for everyone, so just keep at it and don’t be afraid to just go nuts sometimes just to see what happens.
I do think that this looks better than last time, especially the backdrop. Also, it’s clear that you cleaned up the dust/fuzz to make the instruments pop a bit more.
If you want to go for a darker black background, you are going to have to get comfortable with post-processing your raws, shooting with manual settings, or (most likely) some combination of both. When shooting still life, you have the advantage that your subject is not moving, so you can take as many shots as you want to really dial in your settings.
The reason that your black sheet ended up looking gray is that the auto setting is usually trying to normalize the average brightness of the scene to somewhere in the middle of the sensor’s range. This works in reverse too: if you had a bright white background, the auto settings would adjust to make it end up looking duller and grayer. If you are serious about getting black backgrounds, your best bet would be a sweep made of matte black paper. Likewise, when I want a very white background, that same kind of seamless paper (in white) is the best there is.
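To picture why the auto meter grays out a black sheet, here is a toy sketch of the metering idea. This is purely my own illustration (real camera firmware is far more sophisticated): the meter picks a gain so the scene’s average brightness lands near middle gray, so a frame filled with a black sheet gets dragged upward.

```python
# Toy illustration of average metering -- not real camera firmware.
def auto_expose(pixels, target=0.18):
    """Scale brightness values (0.0-1.0) so the scene average hits ~18% gray."""
    gain = target / (sum(pixels) / len(pixels))
    return [min(1.0, p * gain) for p in pixels]

scene = [0.04] * 100          # frame filled with a near-black sheet
print(auto_expose(scene)[0])  # 0.18 -- the "black" sheet is now middle gray
```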
Moving away from auto settings means that you are going to be playing with three settings for the most part (sometimes referred to as the “exposure triangle”). It’s really not that scary; all three are basically different ways to control how much light ends up in the image. I have written super basic summaries below, plus a small worked example after them, but I recommend finding some other reading or videos that can show images/examples of how these things influence your photographs.
Shutter speed/exposure - the amount of time that the sensor is exposed, usually expressed as some fraction of a second. For example, a shutter speed of 1/100 means that the sensor is only collecting light for 1/100th of a second. A couple of useful reference points: 1/100 is about as long an exposure as I can set and still get reliable shots without stabilizing my camera in some way (tripod or similar), and around 1/200 is the fastest shutter speed most cameras can use while still getting reliable results with a flash. Any faster and only part of the picture will see the subject illuminated (unless you have HSS, which kinda sucks but is a whole can of worms). In summary, a longer exposure/slower shutter speed means a brighter image because you are collecting light for a longer period of time.
Aperture/f-stop - this tells you how wide the aperture is open to let more or less light through the lens and onto the sensor. The number is confusing until you get used to it because it is expressed as a fraction like f/2.8 or f/4: the larger the denominator, the less light you let in. So, f/1.4 lets in more light than f/4. This lets you influence how much light you collect independently of the shutter speed. If you are shooting something like a sports game with fast-moving subjects, you might want a fast shutter speed, but to collect enough light for a good image, you can open your aperture way up.
Aperture also controls one other important part of your image: the depth of field. The more open your aperture is (the smaller the f-stop number), the narrower your depth of field becomes. That is why lots of photographers who do things like portraits like to shoot with lenses that can open to very wide apertures. This keeps the subject (person) in focus while the background melts away into a big blurry blob of bokeh, helping separate the subject of the photo from potentially distracting things in the background. Alternatively, shooting something like a landscape often means that you want both the bushes near you and the mountains in the distance to be in focus at the same time, so you need to shoot with a stopped-down aperture. For the product photography that I do, I tend to shoot with a very narrow aperture so that both the foreground and background parts of the product are in focus.
ISO - this is basically telling the camera how much to amplify the light that hits the sensor. Turning up the ISO means that you don’t need to collect as much light, since the camera amplifies whatever does hit the sensor, but it also means much more noise in your image. Cranking up the ISO is pretty common when shooting subjects at night or in dark settings (concerts, etc.), and it is another dial you can turn to adjust how much light you need to collect for a good image. If your aperture is all the way open and you can’t lengthen the exposure any more without a tripod, but the image is still too dark, then you turn up the ISO. This is usually the last thing I reach for when I need more light, as I really don’t like digital noise in my images. I try to keep my ISO below 800, but this can vary a lot depending on the camera/sensor.
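And here is the small worked example I promised, showing the standard “stops” arithmetic the three settings share. This is just my own back-of-the-envelope illustration (arbitrary units, not any real camera API): light on the sensor scales with shutter time and with 1/f-stop², and ISO amplifies whatever arrives.

```python
# Back-of-the-envelope "exposure triangle" arithmetic -- arbitrary units.
import math

def relative_exposure(shutter, fstop, iso):
    """Relative brightness of a settings combo (arbitrary units)."""
    return shutter * (1.0 / fstop**2) * iso

a = relative_exposure(shutter=1/100, fstop=4.0, iso=200)
b = relative_exposure(shutter=1/200, fstop=2.8, iso=200)

# b halves the shutter time (-1 stop) but opens the aperture from f/4 to
# f/2.8 (+1 stop), so the two land within ~0.03 stops of each other:
print(math.log2(b / a))
```

This is why trading one stop of shutter for one stop of aperture leaves the overall brightness basically unchanged while still changing motion blur and depth of field.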
Is this with a flash or with continuous lighting? The reason I ask is that I do product photography for my wife’s business, and I found it much easier to do with flash. There are a couple of benefits:
I think you have done a great job with what I find to be the hardest part of product photography, the composition. In other words, the products are all laid out in an interesting and appealing way. I find that side of photography really tough, and it is hard or impossible to teach if you don’t have an eye for it.
If I were to offer some constructive criticism, it’s that the backdrop is not the most flattering. The uneven texture emphasizes the noise in the image, and the seam cutting through at an angle doesn’t really add anything and can be visually distracting. Similarly, some of the billowiness of the backdrop is creating light/dark areas that don’t add to the composition.
The types of products I shoot are not metallic, so I don’t have to worry about specular reflections, but it is something to keep in mind when photographing subjects like this. For example, the smooth metallic surfaces of these instruments really make things like dust/fuzz stand out; see the area around the dial and the central bottom surface of the instrument on the left. This can be an aesthetic choice though (it almost looks cobwebby), so that is why I just say to keep it in mind.
Great work! Keep it up!
Ah, I’ll have to try that out. I kind of forgot that the PlayStation even has a browser.
Most of my Plex users stream via PlayStation, an area where Jellyfin has essentially been locked out. I let my users know how they can access my Jellyfin ahead of the upcoming loss of remote play, but almost all of them opted to just pay the $2/mo to keep using their PlayStation client.
I have a PhD in physics, primarily working on fluids, and I now work in industry on fluid dynamics. Having just read the abstract, I can already tell that this paper is one of those that borders on a philosophical piece about the author’s view of their field. Nothing wrong with that though; we physicists tend to wax poetic from time to time.
The question of when we can consider turbulence solved is an interesting one. I still work in the field, and for most useful applications of fluid dynamics, I would consider it a solved problem. That is not to say that the Navier-Stokes equations have been solved analytically, but rather that the field has built up a toolbox of phenomenological models and CFD systems that are more than good enough for the range of scales we typically work with. The bigger problem for CFD in this space is optimization, an area where GPUs have proven invaluable. Only in the past couple of years have the major CFD software packages started supporting GPU computation, speeding things up 2-10x depending on the specifics.
I think that turbulence is really only an issue at the extremes at this point (very tiny or very large scales, small dt, hypersonic flows, etc.). It also remains difficult in systems with complex forces acting on the fluid, like a plasma where E&M forces are so significant. So, good luck to all you folks working on fusion reactors!
I’m not a biologist, but if I understand your question correctly, you are basically looking for land-based invertebrates that also lack a hardened exoskeleton (like insects have). This would basically consist of small, soft animals like snails, slugs, leeches, tardigrades, and tons of different types of worms.
The reason that you don’t see large examples of this in land-dwelling creatures is that skeletons or exoskeletons become way more necessary without a medium like the water in the ocean to help support a body. The rigid structure provides an attachment point for musculature to create the mechanical levers we use to manipulate our limbs.
Not sure where this falls on lemmy’s roadmap or if there is a github issue for it, but you can turn off notifications in piefed per post or comment. You can also enable notifications for posts/comments that aren’t your own if there is a thread you want to keep tabs on.
Xiaolan is such a sweet cinnamon roll. The spin-off from her POV is likely to be a lot less dramatic than the main story and more moe antics.
The theory the lead maintainer had (he is an actual software developer, I just dabble) is that it might be a type of reinforcement learning:
If this is what’s happening, then it’s essentially offloading your LLM’s reinforcement learning scoring to open source maintainers.
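To spell the theory out, here is a purely hypothetical sketch of that feedback loop. This is entirely my own illustration, and nobody outside these companies knows how any real pipeline is wired; the idea is just that the maintainer’s reaction to each generated report becomes the score the model trains on.

```python
# Hypothetical sketch of the theory -- not any known, real pipeline.
# A generated bug report is filed, and the maintainer's reaction is
# converted into a reinforcement-learning reward (made-up values).

def reward_from_maintainer(outcome: str) -> float:
    """Map a maintainer's response to a training reward."""
    rewards = {
        "fix_merged": 1.0,       # the report led to a real fix
        "investigated": 0.5,     # a human spent time digging into it
        "closed_invalid": -1.0,  # spotted as bogus and closed
    }
    return rewards.get(outcome, 0.0)

# Each report becomes a (report, reward) training pair -- with unpaid
# maintainers unknowingly doing the labeling work.
pair = ("<generated bug report>", reward_from_maintainer("closed_invalid"))
```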
Really great piece. We have seen many popular lemmy instances struggle under recent scraping waves, and that is hardly the first time it’s happened. I also have some firsthand experience with the second part of this article, the part about AI-generated bug reports/vulnerabilities for open source projects.
I help maintain a python library and got a bug report a couple of weeks back describing a type-checking issue, along with a bit of additional information. It didn’t strictly follow the bug report template we use, but it was well organized enough, so I spent some time digging into it and found no way to reproduce the issue at all. Thankfully, the lead maintainer was able to spot the report for what it was and just closed it, saving me from further efforts to diagnose the issue (after an hour or two had already been burned).
Wow, I hadn’t realized until you pointed it out that you can’t delete PMs (I guess short of getting admins to fiddle with the db). I still use my lemmy account to moderate some lemmy communities, but I am appreciating piefed as my threadiverse consumption platform more and more.
Welcome to lemmy! I just wanted to shout out Piefed, a piece of fediverse software similar to lemmy (and what I am posting from) that is actually written in Flask!
I am very new to Flask, but I have found myself dipping into the Flask community on reddit from time to time as I noodle away at a project. So, it’s nice to see one on here as well.
You have clearly never driven on 93 through Boston, where the person you replied to said they are from (aka the Big Dig). It is basically an entire highway that runs underneath the city, with tons of on- and off-ramps, lanes that suddenly become exit-only, and complex multi-lane exits that branch…it’s intimidating. As somebody that has lived in the Boston area for 15 years now, I still mess things up.
Official response from Greg Bernhardt
It’s been years since I last used PhysicsForums, but I found it immensely useful in the old days while going through my undergrad physics degree (it was less useful for PhD courses). I am not morally opposed to providing AI attempts at an answer in threads where nobody else chimes in. However, using real accounts that belong to other users is wildly over the line. I was surprised that the existing users didn’t really call this out in the official response thread, as that is the most egregious part of all this to me.
IIRC, piefed’s private votes are disabled for “trusted” instances. You can see which instances are trusted here.
I have a PhD in, and am a practicing physicist in, the field of rheology. I think this is an interesting way to explain viscoelastic materials to people. My go-to example is usually Silly Putty, but cats are something that just about everybody has some experience with.
I guess it depends on what you want to do with it. I do lots of product photography, so I know exactly what lens I need for my studio and the type of product I am shooting. So, I spent about as much on a lens as I did on the body. Getting a better sensor with more accurate colors saves me time in the post-processing step.
When I was starting out, I just used a kit zoom lens, but realized that most of my shots were around the same focal length. So that is when I invested in a faster, nicer prime lens at that focal length.
Also, this is neither here nor there, but I have been trying out piefed lately and it’s pretty cool. Thanks to the devs over there for squashing a couple of bugs I found from our use cases, the anime community should be a lot more compatible with piefed going forward. Specifically, piefed users should now enjoy:
I just looked at the first PR out of curiosity, and wow…
That’s the part that surprised me the most. It failed the existing automation. Even after being prompted to fix the failing tests, it proudly added a commit “fixing” them (they still didn’t pass…something that copilot should really be able to check). Then the dev had to step in and explain why the test was failing and how to fix the code to make it pass. With this much handholding, all of this could have been done much faster and cleaner without any AI involvement at all.