• 0 Posts
  • 36 Comments
Joined 1 year ago
Cake day: August 6th, 2023

  • The most elite trackers perhaps.

    Trackers on /r/opensignups? Nah, they open their doors to the public every now and again.

    Would not recommend it to anyone who can’t dedicate a seed box or a machine uploading torrents most hours of the day, every day. It’s possible to do it without those, but difficult. With them it’s merely a matter of using freeleech to build up a buffer, and taking advantage of points systems that grant free upload credit just for keeping torrents seeding, even when nothing is actually downloading from you.

    If you only ever grab freeleech, then all you have to worry about is meeting seed-time and activity requirements, like logging in every 90 days.

    An old computer with an external drive, a Raspberry Pi, or a NAS that can run a BitTorrent client: any would work if you don’t want to pay for a seed box. (Most trackers ban shared seed boxes though, so you’d have to get a dedicated one.)
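    The ratio economics described above are simple to sketch. This is a hypothetical illustration, not any tracker’s actual accounting; the 1.0 required ratio is an assumed example:

```python
def ratio(uploaded_gb, downloaded_gb):
    """Tracker ratio: uploaded / downloaded. Freeleech grabs add
    nothing to the denominator, so downloading only freeleech
    can never hurt your ratio."""
    if downloaded_gb == 0:
        return float("inf")
    return uploaded_gb / downloaded_gb

def buffer_gb(uploaded_gb, downloaded_gb, required_ratio=1.0):
    """How much more you could download before dropping below a
    (hypothetical) required ratio."""
    return uploaded_gb / required_ratio - downloaded_gb

# Example: 50 GB uploaded, 20 GB counted as downloaded
print(ratio(50, 20))      # 2.5
print(buffer_gb(50, 20))  # 30.0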


  • No.

    HDMI does have a feature called HDMI Ethernet Channel (Ethernet over HDMI) that in theory could allow that.

    Thing is, though, it’s essentially never been implemented in anything. It died because cheap WiFi became ubiquitous.

    For it to work, the TV, the Chromecast, and the HDMI cable would all need to support it. It’s not uncommon on cables, and a surprising number of them list it among their features (probably to mislead uninformed buyers).

    But I believe support is a hardware design decision, so it’s not something even a software update could enable. It costs extra money, and manufacturers are already paying for a WiFi chip, so why bother?


    Just FYI: comments nearly identical to yours on Reddit were used in copyright-troll lawsuits against ISPs as evidence that the ISPs didn’t do enough to enforce copyright and were negligent and legally liable.

    Further, when that didn’t work, the copyright agency sued Reddit to try to unmask those users’ identities and bring legal proceedings against them, to coerce them into testifying against their ISP under threat of being prosecuted for their own activities. Reddit was big enough to fight off the lawsuit, luckily, but be careful.




    Remind me again, how did we get those free-speech rules on college campuses in the first place? What were they in response to? Oh right: the inability to protest the Vietnam War, and the demand from students of that era and the one that followed to be allowed to protest.

    Aside from Iraq, which didn’t last that long to my recollection, every major protest movement since has been against either enemies of the US or those the US is ambivalent about, like apartheid South Africa.

    So this “right” never really annoyed the powers that be until now, with the pro-Palestine solidarity movement. Now they have to crush it, all the pretensions of liberalism to caring about free speech, expression, and thought are being put away, and we’re assured these people represent hate and must be crushed.

    Seems like we never really had that right. It was a concession, granted in the expectation it would never be used again, and now that it has been, there’s outrage and the iron fist.




  • Politely agree to disagree and I’ll elaborate. Thanks for your input.

    LTH discs are all marked as such. Normal (non-LTH) MABL discs, which Verbatim sells for less than half the cost of M-Discs, have the same physical properties as M-Discs: the protective layers are the same, and the recording methods are the same, using the same materials. Therefore the longevity is the same or nearly the same, without getting into M-Disc’s ridiculous marketing claims of 1,000 years (when NIST and others agree the polyacrylic protective layer would degrade and decompose after a century or two at most, even in ideal circumstances).

    /r/DataHoarder has had this argument several times, and the consensus so far seems to come down to this: M-Discs were a DVD-era innovation that, in the BD era, offers no meaningful technological advantage.

    I’d rather have two BDs from a reputable company like Verbatim (not fly-by-night plain-white discount bulk discs from who knows where), from separate batches bought 6 months apart and stored properly, than rely on one overpriced M-Disc that isn’t going to last any longer and probably isn’t made to meaningfully tighter tolerances.

    NIST only estimates the lifetime of M-Discs, while real-world abuse tests on BDs (non-LTH; I should have mentioned that, to be honest) show good endurance that far exceeds DVDs. It comes down, however, to burning them right and storing them right. A pile of M-Discs left in a window in your uninsulated garage year after year and burned at 16x is not, on the whole, going to be in a better state in 20 years than a pile of BD-Rs burned at 4x and stored in protective sleeves in a case in a temperature-controlled, insulated environment. Add in a backup copy, consider the odds of total data failure on both the primary and backup discs, and you’re looking at better survivability.

    NIST’s numbers generally assume storage in archival-quality environments such as old salt mines: controlled, low humidity, neither excessively hot nor cool, and not subject to temperature swings. Most people can’t store things in an environment like that, and those who can usually have the finances for a better solution, like multiple tape copies and/or continually refreshing hashed/checksummed files and migrating them on a schedule to newer, better storage media (e.g. keeping files in a RAID array in a powered NAS, checking for failures regularly, and replacing or upgrading disks every 5–10 years, one at a time).
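    The hash-and-verify routine mentioned above can be sketched with Python’s standard hashlib; the manifest structure here is a hypothetical example, not any particular tool’s format:

```python
import hashlib

def sha256_file(path, chunk_size=1 << 20):
    """Stream a file through SHA-256 in 1 MiB chunks so large
    archive files never need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(manifest):
    """Compare stored digests against the files on disk; a
    mismatch flags silent corruption before it propagates into
    the next generation of backups."""
    return {path: sha256_file(path) == digest
            for path, digest in manifest.items()}
```

    Run this on a schedule (and re-hash after every migration to new media) and you catch bit rot while a good copy still exists elsewhere.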

    I wouldn’t trust any media that isn’t professionally stored in a purpose-built archival environment, with at least two copies, to last more than 25 years without degradation or loss. Anyone storing things really long-term who cannot afford degradation or loss needs a plan to update their archival copies every 15 years, or at least to do an assessment that often: survey the options, and check the physical and ideally logical state of their chosen backups.


    M-Discs had merit in the DVD era. It’s a common refrain from those who don’t know the intricacies and read a Wired article years ago that they mean anything in the Blu-ray era. They don’t.

    Standard Blu-ray discs have all the technologies that supposedly make M-Discs so long-lasting, and as far as media that isn’t continuously updated and hashed from one live storage medium to another goes (i.e. cold, unpowered archival storage), they are about as good as you’ll get.

    They are much tougher than DVDs. Of course, a variety of things go into how long a disc remains readable and undamaged, including luck with impurities in the batch. Even M-Disc themselves based their longest claims on storage in ideal conditions, like an inactive salt mine (commonly used for archives by governments). Kept out of the sun, away from extreme heat (including baking in uninsulated 120°F heat all summer, year after year), away from high humidity and UV exposure to the data side of the disc, and free of scratches and the like, they should last a quarter to half a century, some more.

    In the Blu-ray era, M-Disc is just an overly expensive brand.




    DVDs max out at 576p (PAL; NTSC is 480p). Resolutions are named for the number of scan lines (rows of pixels, counted from top to bottom of the frame), not the width, which on DVDs is 720 pixels regardless of whether the content is 4:3 or 16:9 (DVD pixels are non-square). By contrast, full HD video is 1920 pixels wide, while its line count is of course 1080, hence 1080p. You’re not the first person to be confused by this.
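    The naming convention can be summarized in a small table (standard format definitions, spelled out here for illustration):

```python
# Resolution labels count scan lines (rows, top to bottom);
# width in pixels varies with the format and aspect ratio.
resolutions = {
    "480p (NTSC DVD)": (720, 480),    # (width, height) in pixels
    "576p (PAL DVD)":  (720, 576),
    "720p":            (1280, 720),
    "1080p":           (1920, 1080),
}

for name, (width, height) in resolutions.items():
    print(f"{name}: {width} columns x {height} rows")
```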

    Professional encoders who fully understand the codecs and the schemes in use, and who care about avoiding artifacting and low quality, would never intentionally go as low as 300 MB for a feature-length movie, even one only an hour long. Yes, there are people who do such things, but they’re not well regarded, and it won’t look even passable on anything larger than a phone screen.

    Recognized quality groups that aim for small sizes might get an animated feature (less bitrate is needed, given the lack of fine detail in animation versus live film) down to around that in SD quality. But for most live-action content, the sizes I see from the best of the size-conscious groups are in the 900 MB to 1.5 GB range for 60–90 minute features.

    300 MB for a 90-minute live-action feature, even in SD, is just not going to look good. Some of the groups that hit those sizes make them look even half-passable by running pre-filters in VirtualDub that smooth the image and reduce grain and detail before passing it to the encoder. That kind of thing is way beyond anything you’re going to learn in a few YouTube videos, though; it’s advanced stuff with scripting.

    Think about it this way: if you aim for 1 GB encodes with x265 or AV1, you can store over 900 movies on a 1 TB drive, which can be had for well under a hundred dollars.

    > I would like the best and fanciest algorithms to have least dataloss.

    There is no magic that will get you where you want to go. If you want detail preserved, you need more bitrate, which translates to larger sizes. Modern codecs like HEVC and AV1 mean you need as little as 1/5th the bitrate of the old MPEG-2/MPEG-4 encoding schemes used on DVDs; that’s darn good savings, but it has its limits.

    Do as you will, but anything live-action (non-animated) significantly under 1000 kbps average bitrate is going to look awful on a 1080p screen, and much worse than if you just popped your DVD in the disc drive and played it from there.
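    The size-versus-bitrate math above is worth making explicit. A rough sketch, ignoring container overhead:

```python
def size_mb(total_kbps, minutes):
    """File size from average bitrate: kilobits/s * seconds / 8
    gives kilobytes; / 1000 gives MB (container overhead ignored)."""
    return total_kbps * minutes * 60 / 8 / 1000

# A 300 MB target over 90 minutes leaves only ~444 kbps total for
# video AND audio -- well under the ~1000 kbps floor suggested
# above for watchable live-action video.
budget_kbps = 300 * 1000 * 8 / (90 * 60)
print(round(budget_kbps))        # 444

# A ~1 GB encode of the same runtime allows ~1480 kbps total.
print(round(size_mb(1480, 90)))  # 999
```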

    Opus is fine if you’re not worried about compatibility and are just playing on a computer.


  • I really like the one I have. A relative has a much older model and it still works fine too.

    It’s very responsive and the 4k models are quite powerful and future proofed IMO. If you have an iPhone you can quickly use it as a remote too.

    Paired with the Infuse app, it even does local streaming from my media server well.

    And it’s cheaper to get this year’s top-of-the-line Apple TV than a 2019 Nvidia Shield Pro.


  • Majestic@lemmy.ml to Asklemmy@lemmy.ml: Problem with Lenovo B50

    Well, re-applying thermal paste is a big job. I’d try stressing the machine and seeing if you can force the failure after cleaning the vents.

    So do things that stress the processor and see if you can make the failure happen repeatedly. If it is a thermal issue, you should be able to trigger it by inducing high load. If you can’t, that points more toward other issues: a fault in the RAM or motherboard, a loose module or connection, etc.

    And just FYI, if you need more help in future, this community isn’t intended for it. Try posting in a tech or computer-help community for a better chance that more people engage.


    Does it happen randomly, or after a set time of use or under intense use? The processor could be overheating past its safety thresholds (new thermal paste could fix this, IF that’s the issue).

    I’d try cleaning the vents and fans before changing the thermal paste, though. They could be clogged and not working right.

    Failing that, it could be any of several things. But I’d make sure cooling is adequate first: keep the vents unobstructed during use, use the laptop on a flat, hard surface, and clean the vents with a vacuum and/or canned air.
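    A minimal way to induce the sustained load described above is one busy-loop worker per core; this is just a sketch (the 60-second default is arbitrary), and dedicated tools like stress-ng or Prime95 are more thorough:

```python
import multiprocessing as mp
import time

def busy(seconds):
    """Spin the CPU at full load for `seconds` to push
    temperatures up and try to reproduce a thermal shutdown."""
    end = time.monotonic() + seconds
    while time.monotonic() < end:
        pass

def stress_all_cores(seconds=60):
    """Run one busy-loop process per logical core."""
    workers = [mp.Process(target=busy, args=(seconds,))
               for _ in range(mp.cpu_count())]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
```

    Watch temperatures with a monitoring tool while it runs; if the machine cuts out under this load but not at idle, that points at cooling.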


    As others mentioned, finding a good encoder is an issue for AAC, as is having some skill in using and tuning it.

    Nearly all quality release groups now use AC3/EAC3 or FLAC. Tigole is the last one who uses AAC, to my knowledge, and the rest of the QXR group rolls their eyes at it.

    You’re not going to get a meaningful reduction in bitrate and file size with AAC over EAC3/AC3 without loss of quality. We’re talking maybe 200–300 kbps shaved off versus an AC3 5.1 track, and it’s tricky. So much so that no one other than the one person I mentioned bothers; at least, no one accepted in the higher echelons as competent at creating acceptably transparent encodes.

    If a source has EAC3 (itself capable of roughly halving the bitrate required versus AC3) or AC3, I’d recommend keeping it, as those tracks tend to already be efficient and both codecs are near-universally compatible. Re-encode those big 1500 kbps DTS tracks and those even bigger monster lossless Dolby and DTS tracks, but I’d leave efficient codecs like AC3 alone.
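    The savings claim is easy to put in numbers. A sketch with illustrative bitrates (640 kbps is a common AC3 5.1 rate; the AAC figure is a hypothetical example of shaving ~250 kbps off):

```python
def track_size_mb(kbps, minutes):
    """Audio track size from a constant bitrate, in MB."""
    return kbps * minutes * 60 / 8 / 1000

# Over a 2-hour movie, dropping from a 640 kbps AC3 track to a
# 390 kbps AAC track saves only ~225 MB -- arguably not worth
# the encoder fuss described above.
ac3 = track_size_mb(640, 120)   # 576.0 MB
aac = track_size_mb(390, 120)   # 351.0 MB
print(round(ac3 - aac))          # 225
```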

    That said, it’s up to you what sounds good. If you’re using lower-end equipment and can’t tell the difference after trying a few test videos with different types of sound, then go for it.




    Don’t bother with M-Discs; they only provided a meaningful advantage in the DVD era. I’ve researched this a bit myself, and the consensus, at least in the data-hoarding community, is to use two Blu-ray discs from two different batches (bought 6 months apart), which still comes out cheaper than, or the same as, branded M-Discs. (Though that may be overkill; truth be told, as long as you test the disc and its data some months after writing, you’ll tend to catch any rare bad ones.)

    Truth is, quality Blu-ray discs have all the features in their design spec that would give them M-Disc-type longevity. Just make sure they’re not low-to-high (LTH) discs, which are inferior but at least always marked as such.

    Don’t get no-name cheap ones either; get Verbatim, Sony, or another good Japanese brand. For Verbatim specifically, the discs marked MABL on the package are better.

    Always burn data at lower speeds, too; you’ll get fewer errors.