Console gaming is hardly different from PC gaming, and much of what people say to put PC gaming above console gaming is wrong.
I’m not sure about you, but for the past few years, I’ve been hearing people go on and on about PCs’ "superiority" over the console market. People cite various reasons why they believe gaming on a PC is “objectively” better than console gaming, often reasons related to power, cost, ease of use, and freedom. …Only problem: much of what they say is wrong. There are many misconceptions being thrown around in the PC vs. console debate that I believe need to be addressed. This isn’t about “PC gamers being wrong,” or “consoles being the best,” absolutely not. I just want to cut through some of the stuff people use to put down console gaming, and show that console gaming is incredibly similar to PC gaming. And yes, this is coming from someone who mainly games on console, but I’m also getting a new PC that I’ll game on as well, not to mention the 30 PC games I already own and play. I’m not particularly partial to one over the other. I’ll mainly be focusing on the PlayStation side of the consoles, because I know it best, but much of what I say applies to Xbox as well. Just because I don’t point out many specific Xbox examples doesn’t mean they aren’t out there.
“PCs can use TVs and monitors.”
This one isn’t so much a misconception as it is the implication of one, and overall just… confusing. It shows up in some articles and in the pcmasterrace “why choose a PC” section, where they’re practically implying that consoles can’t do this. I mean, yes, as long as the ports on your PC match up with your screen’s inputs, you could plug a PC into either… but you could do the same with a console, again, as long as the ports match up. I’m guessing the idea here is that gaming monitors often use DisplayPort, as do most dedicated GPUs, while consoles are generally restricted to HDMI… But even so, monitors often have HDMI ports. In fact, PC Magazine just released their list of the best gaming monitors of 2017, and every single one of them has an HDMI port. A PS4 can be plugged into these just as easily as a GTX 1080. Even if the monitor/TV doesn’t have HDMI or AV to connect with your console, just use an adapter. If you have a PC with ports that don’t match your monitor/TV… use an adapter. I don’t know what the point of this argument is, but it’s made a worrying number of times.
“On PC, you have a wide range of controller options, but on console you’re stuck with the standard controller."
Are you on PlayStation and wish you could use a specific type of controller that suits your favorite kind of gameplay? Despite what some may believe, you have just as many options as PC. Want to play fighting games with a classic arcade-style board, featuring the buttons and joystick? Here you go! Want to get serious about racing and get something more accurate and immersive than a controller? Got you covered. Absolutely crazy about flying games and, like the racers, want something better than a controller? Enjoy! Want Wii-style motion controls? They’ve been around since the PS3. If you prefer the form factor of the Xbox One controller but you own a PS4, Hori’s got you covered. And of course, if keyboard and mouse is what keeps you on PC, there’s a PlayStation-compatible solution for that. Want to use the keyboard and mouse you already own? Where there’s a will, there’s a way. Of course, these aren’t isolated examples; there are plenty of options for each of these kinds of controllers. You don’t have to be on PC to enjoy alternate controllers.
“On PC you could use Steam Link to play anywhere in your house and share games with others.”
PS4 Remote play app on PC/Mac, PSTV, and PS Vita. PS Family Sharing. Using the same PSN account on multiple PS4s/Xbox Ones and PS3s/360s, or using multiple accounts on the same console. In fact, if multiple users are on the same PS4, only one has to buy the game for both users to play it on that one PS4. On top of that, only one of them has to have PS Plus for both to play online (if the one with PS Plus registers the PS4 as their main system). PS4 Share Play; if two people on separate PS4s want to play a game together that only one of them owns, they can join a Party and the owner of the game can have their friend play with them in the game. Need I say more?
“Gaming is more expensive on console.”
Part 1: the Software
This is one that I find… genuinely surprising. There have been a few times I’ve mentioned that part of the reason I chose a PS4 is budget gaming, only to be told that “games are cheaper on Steam.” To be fair, there are a few games on PSN/XBL that are more expensive than they are on Steam, so I can see how someone could believe this… but apparently they forgot about discs. Dirt Rally, a hardcore racing sim, is… still $60 on all 3 platforms digitally… even though its successor is out.
See my point? Oftentimes the game is cheaper on console because of the disc alternative that’s available for practically every console game, even when the game is brand new. Dirt 4 - remember that Dirt Rally successor I mentioned?
Yes, you could either buy this relatively new game digitally for $60, or just pick up the disc for a discounted price. And again, this is for a game that came out 2 months ago, while its predecessor’s digital cost is locked at $60. Of course, I’m not going to ignore the fact that Dirt 4 is currently (as of writing this) discounted on Steam, but on PSN it also happens to be discounted by about the same amount.
Part 2: the Subscription
Now… let’s not ignore the elephant in the room: PS Plus and Xbox Gold. These would be ignorable if they weren’t required for online play (on the PlayStation side it’s only required on PS4, but still). So yes, it’s still something that will be included in the cost of your PS4 or Xbox One/360, assuming you play online. Bummer, right? Here’s the thing: although you have to factor this $60 annual cost into your console, you can make it balance out at worst, and make it work for you as a budget gamer at best. As nice as it would be to not deal with the price at all, it’s not a problem if you use it correctly. Imagine going to a new restaurant. This restaurant has some meals you can’t get anywhere else, and fair prices compared to competitors. Only problem: you have to pay a membership fee to have the sides. You can have the main course, sit down and enjoy your steak or pasta, but if you want a side to make it a full meal, you have to pay an annual fee. Sounds shitty, right? But here’s the thing: not only does this membership let you have sides with your meal, it also lets you eat two meals for free every month, and gives you exclusive discounts on other meals, drinks, and desserts. Let’s look at PS Plus for a minute: for $60 per year, you get:
2 free PS4 games, every month
2 free PS3 games, every month
1 PS4/PS3 and Vita compatible game, and 1 Vita-only game, every month
Exclusive/Extended discounts, especially during the weekly/seasonal sales (though you don’t need PS Plus to get sales, PS Plus members get to enjoy the best sales)
access to online multiplayer
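For what it’s worth, the yearly tally implied by that lineup is easy to check; a quick sketch, using the monthly counts exactly as listed above:

```python
# PS Plus monthly freebies as listed above; the yearly tally is a simple multiply.
monthly = {"PS4": 2, "PS3": 2, "Vita-compatible": 1, "Vita-only": 1}
yearly = {platform: count * 12 for platform, count in monthly.items()}
total_free_games = sum(yearly.values())

annual_fee = 60  # the PS Plus subscription cost in dollars
print(yearly["PS4"], total_free_games)  # 24 72
```

So even owning just one platform nets 24 games a year against the $60 fee.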
So yes, you’re paying extra because of that membership, but what you get with that deal pays for it and then some. In fact, let’s ignore the discounts for a minute: you get 24 free PS4 games, 24 free PS3 games, and 12 Vita-only + 12 Vita-compatible games, up to 72 free games every year. Even if you only own one of these consoles, that’s still 24 free games a year. Sure, maybe you get games one month that you don’t like; then just wait until next month. In fact, let’s look at Just Cause 3. It was free for PS Plus members in August, which is a pretty big deal. Why is this significant? Because it’s a $60 digital game. That means with this one download, you’ve balanced out your $60 annual fee. Meaning? Every free game after that is money saved, every discount after that is money saved. And this is a trend: every year, PS Plus will release a game that balances out the entire service cost, then another 23 more that only add icing to that budget cake. (Or, if you prefer, count games toward paying off PS Plus until you hit $60 in savings; the math works out the same.) All in all, PS Plus, and Xbox Gold, which offers similar perks, saves you money. On top of that, again, you don’t need these to get discounts, but with the memberships you get more discounts. Now, I’ve seen a few Steam games go up for free for a week, but what about being free for an entire month? And even if you want to talk about Steam Summer Sales, what about the PSN summer sale, or again, disc discounts? A lot of research and math would be needed to see whether every console gamer would save money compared to every Steam gamer for the same games, but at the very least? The costs balance out, at worst.
Part 3: the Systems
Xbox and PS2: $299
Xbox 360 and PS3: $299 and $499, respectively
Xbox One and PS4: $499 and $399, respectively.
Rounded up a few dollars, that’s $1,000 - $1,300 in day-one consoles, just to keep up with the games! Crazy, right? So-called budget systems, such a rip-off. Well, keep in mind that the generations here aren’t short. The 6th generation, from the launch of the PS2 to the launch of the next generation of consoles, lasted 5 years, 6 if you count to the PS3’s launch (though you could say it was 9 or 14, since the Xbox wasn’t discontinued until 2009, and the PS2 was supported all the way to 2014, a year after the PS4 was released). The 7th gen lasted 7 - 8 years, again depending on whether you count from the launch of the Xbox 360 or the PS3. The 8th gen has so far lasted 4 years. That’s 17 years that the console money is spread over. If you’d had a Netflix subscription at its original $8 monthly plan for that long, it would total over $1,600. And let’s be fair here: just as you can upgrade your PC hardware whenever you want, you didn’t have to buy each console at launch. Let’s look at PlayStation again. In 2002, only two years after its release, the PS2’s retail price was cut from $300 to $200. The PS3 Slim, released 3 years after the original, was $300, $100 - $200 lower than the original retail cost. The PS4? You could’ve gotten either the Uncharted bundle for $350, or one of the PS4 Slim bundles for $250. That brings it down to $750 - $850, which again is spread over a decade and a half. This isn’t even counting used consoles, sales, or the further price cuts I didn’t mention. Even if that still sounds like a lot of money to you, even if you’re laughing at the thought of buying new systems every several years because your PC “is never obsolete,” tell me: how many parts have you changed out in your PC over the years? How many GPUs have you been through? CPUs? Motherboards? RAM sticks, monitors, keyboards, mice, CPU coolers, hard drives… it adds up. You don’t need to replace your entire system to spend a lot of money on hardware.
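The generation math above can be checked in a few lines; a sketch assuming one console per generation at the day-one prices quoted, and the original $8/month Netflix plan held for the same 17 years:

```python
# One console per generation, at the launch prices listed above.
cheapest_path = 299 + 299 + 399   # PS2 (or Xbox), Xbox 360, PS4
priciest_path = 299 + 499 + 499   # PS2 (or Xbox), PS3, Xbox One

# A Netflix subscription at $8/month over the same ~17-year span.
netflix_same_period = 8 * 12 * 17

print(cheapest_path, priciest_path, netflix_same_period)  # 997 1297 1632
```

Which is where the rounded $1,000 - $1,300 range and the “over $1,600” Netflix figure come from.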
Even if you weren’t upgrading for the sake of upgrading, I’d be amazed if hardware you’ve been pushing with gaming lasted even a third of that 17-year period. Computer parts aren’t designed to last forever, and they really won’t when you’re pushing them with intensive gaming for hours upon hours. Generally speaking, your components might last you 6 - 8 years, if you’ve got the high-end stuff. But let’s assume you bought a system 17 years ago that was a beast for its time, something so powerful that even as its parts degraded over time, it’s still going strong. Problem is: you will have to upgrade something eventually. Even if you’ve managed to get this far into the gaming realm with the same 17-year-old hardware, I’m betting you didn’t do it with a 17-year-old operating system. How much did Windows 7 cost you? Or 8.1? Or 10? Oh, and don’t think you can skirt the cost by getting a pre-built system; the cost of Windows is embedded in the cost of the machine (why else would Microsoft allow their OS on so many machines?). Sure, Windows 10 was a free upgrade for a year, but that’s only half of its lifetime so far: you can’t get it for free now, and couldn’t for the past year. On top of that, the free period was an upgrade; you had to pay for 7 or 8 first anyway. Point is, as much as one would like to say they didn’t need to buy a new system every so often for the sake of gaming, that doesn’t mean they haven’t been paying for hardware, and even if you’ve only been PC gaming recently, you’ll be spending money on hardware soon enough.
“PC is leading the VR—“
Let me stop you right there. If you added together the total number of Oculus Rifts and HTC Vives sold to this day, and threw in another 100,000 just for the sake of it, that number would still be under the number of PSVR headsets sold. Why could this possibly be? Well, for a simple reason: affordability. The systems needed to run the PC headsets cost $800+, and the headsets are $500 - $600 when discounted. PSVR, on the other hand, costs $450 for the full bundle (headset, camera, and Move controllers, with a demo disc thrown in), and can be played on either a $250 - $300 console or a $400 console, the latter recommended. Even if you want to say the Vive and Rift are more refined, a full PSVR set, system and all, could cost just over $100 more than a Vive headset alone. If anything, PC isn’t leading the VR gaming market, the PS4 is. It’s the system bringing VR to the most consumers, showing them what the future of gaming could look like. Not to mention that as the PlayStation line grows more powerful (4.2 TFLOP PS4 Pro, 10 TFLOP “PS5…”), it won’t be long until the PlayStation line can use the same VR games as PC. Either way, this shows that there is a console equivalent to the PC VR options. Sure, there are some games you’d only be able to play on PC, but there are also some games you’d only be able to play on PSVR. …Though to be fair, if we’re talking about VR in general, these headsets don’t even hold a candle to, surprisingly, Gear VR.
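The affordability comparison above can be restated as a rough total-cost sketch, using the 2017 prices quoted (the $250 figure being the low-end PS4 Slim bundle):

```python
# Full PSVR setup vs a discounted PC headset alone, figures as cited above.
psvr_bundle = 450                   # headset + camera + Move controllers
ps4_slim_low, ps4_slim_high = 250, 300
vive_discounted = 600               # discounted Vive/Rift headset, no PC

full_psvr_low = psvr_bundle + ps4_slim_low     # cheapest complete PSVR setup
full_psvr_high = psvr_bundle + ps4_slim_high
gap = full_psvr_low - vive_discounted          # the "just over $100" gap
print(full_psvr_low, full_psvr_high, gap)  # 700 750 100
```

And the PC-side total ($800+ system plus a $500 - $600 headset) starts around $1,300 before discounts.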
“If it wasn’t for consoles holding devs back, then they would be able to make higher quality games.”
This one is based on the idea that consoles are so “low spec” that when a developer has to keep them in mind, they can’t design the game to be nearly as good as it would otherwise be. I mean, have you ever seen the minimum specs for games on Steam? GTA V
Actually, bump all the memory requirements up to 8 GB, and those are some decent specs, relatively speaking. And keep in mind these are the minimum specs just to run the games. It’s almost as if the devs didn’t worry about console specs when making the PC version, because this version of the game isn’t on console. Or maybe the consoles aren’t holding the games back that much because they’re not that weak. Just a hypothesis. But I mean, the devs are still ooobviously having to take weak consoles into account, right? They could make their games sooo much more powerful if they were PC-only, right? Right? No. Not even close. iRacing
CPU: Intel Core i3, i5, i7 or better or AMD Bulldozer or better
Memory: 8 GB RAM
GPU: NVidia GeForce 2xx series or better, 1GB+ dedicated video memory / AMD 5xxx series or better, 1GB+ dedicated video memory
These are PC-only games. That’s right, no consoles to hold them back; they don’t have to worry about whether an Xbox One could handle it. Yet they don’t require anything more than the multiplatform games. Subnautica
So what’s the deal? Theoretically, if developers don’t have to worry about console specs, then why aren’t they going all-out and making games that no console could even dream of supporting? Low-end PCs. What, did you think people only game on Steam if they spent at least $500 on gaming hardware? Not all PC gamers have gaming-PC specs, and if devs close their games out to players who don’t have the strongest of PCs, then they’d be losing out on a pretty sizable chunk of their potential buyers. Saying “devs having to deal with consoles is holding gaming back” is like saying “racing teams having to deal with Ford is holding GT racing back.” A: racing teams don’t have to deal with Ford if they don’t want to, which is probably why many of them don’t, and B: even though Ford doesn’t make the fastest cars overall, they still manage to make cars that are awesome on their own, they don’t even need to be compared to anything else to know that they make good cars. I want to go back to that previous point though, developers having to deal with low-end PCs, because it’s integral to the next point:
“PCs are more powerful, gaming on PC provides a better experience.”
This one isn’t so much a misconception as it is… misleading. Did you know that according to the Steam Hardware & Software Survey (July 2017), the percentage of Steam gamers who use a GPU less powerful than a PS4 Slim’s is well over 50%? Things get dismal when compared to the PS4 Pro (or Xbox One X). On top of that, the percentage of PC gamers who own an Nvidia 10 series card is about 20% (about 15% counting just the 1060, 1070, and 1080 owners). Now, to be fair, the large majority of gamers have CPUs with considerably high clock speeds, which is the main factor in CPU gaming performance. But the share of Steam gamers with as much RAM as a PS4 or Xbox One, or more, is less than 50%, which can really bottleneck what those CPUs can handle. These numbers are hardly better than they were in 2013, all things considered. Sure, a PS3/360 weeps in the face of even a $400 PC, but in this day and age, consoles have definitely caught up. Sure, we could mention the fact that even 1% of Steam accounts represents over 1 million accounts, but that doesn’t mean much against the tens of millions of 8th gen consoles sold; looking at it that way, sure, the number of Nvidia 10 series owners is over 20 million, but that ignores the fact that over 5 times as many 8th gen consoles have been sold. Basically, even though PCs run on a spectrum, saying they’re more powerful “on average” is actually wrong. Sure, they have the potential to be more powerful, but most of the time, people aren’t willing to pay the premium to reach those extra bits of performance. Now why is this important? What matters are the people who spent the premium cost for premium parts, right?
Because of the previous point: PCs don’t have some across-the-board quality advantage over the consoles. Developers will always have to keep low-end PCs in mind, because not even half of all PC players can afford the good stuff, and you have to look at the top quarter of Steam players before you reach PS4-Pro-level specs. If every Steam player were to get a PS4 Pro, it would be an upgrade for over 60% of them, and 70% would be getting an upgrade with the Xbox One X. Sure, you could still argue that when you pay more for PC parts, you get a better experience than you could with a console. We can argue all day about budget PCs, but a console can’t match a $1,000 PC build. It’s the same as paying more for car parts; in the end you get a better car. However, there’s a certain problem with that…
“You pay a little more for a PC, you get much more quality.”
The idea here is that the more you pay for PC parts, the performance increases at a faster rate than the price does. Problem: that’s not how technology works. Paying twice as much doesn’t get you twice the quality the majority of the time. For example, let’s look at graphics cards, specifically the GeForce 10 series cards, starting with the GTX 1050.
1.35 GHz base clock
2 GB VRAM
This is our reference, our basis of comparison. Any percentages will be based on the 1050’s specs. Now let’s look at the GTX 1050 Ti, the 1050’s older brother.
1.29 GHz base clock
4 GB VRAM
This is pretty good. You increase the price by only about 27%, and you get an 11% increase in floating-point speed and a 100% increase (double) in VRAM. Sure, you get a slightly lower base clock, but the rest definitely makes up for it. In fact, according to GPU Boss, the Ti managed 66 fps in Battlefield 4, a 22% increase in frame rate, and a 54% increase in mHash/second in bitcoin mining. The cost increase is worth it, for the most part. But let’s get to the real meat of it: what happens when we double our budget? Surely we should see a massive increase in performance; I bet some of you expect that twice the cost means more than twice the performance. The closest price comparison for double the cost is the GTX 1060 (3 GB), so let’s take a look at that.
1.5 GHz base clock
3 GB VRAM
Well… not substantial, I’d say. About a 50% increase in floating-point speed, an 11% increase in base clock speed, and a 1 GB decrease in VRAM. For [almost] doubling the price, you don’t get much. Well, surely raw specs don’t tell the full story, right? Let’s look at some real-world comparisons. Once again according to GPU Boss, there’s a 138% increase in hashes/second for bitcoin mining, and at 99 fps, an 83% frame-rate increase in Battlefield 4. Well then, raw specs do not tell the whole story! Here’s another one, the 1060’s big brother… or, well, slightly-more-developed twin.
1.5 GHz base clock
6 GB VRAM
Seems reasonable: another $50 for a decent jump in power and double the memory! But, as we’ve learned, we shouldn’t look at the specs alone for the full story. I did do a GPU Boss comparison, but for the BF4 frame rate I had to look at Tom’s Hardware (sorry, miners, GPU Boss didn’t cover the mHash/sec spec either). What’s the verdict? Pretty good, I’d say. With 97 FPS, a 79% increase over the 1050… wait. 97? That seems too low… I mean, the 3 GB version got 99. Well, let’s see what TechPowerUp has to say... 94.3 fps. A 74% increase. Huh. Alright, alright, maybe that was just a dud. We can gloss over that, I guess. OK, one more, but let’s go for the big fish: the GTX 1080.
1.6 GHz base clock
8 GB VRAM
That jump in floating-point speed definitely has to be something, and 4 times the VRAM? Sure, it’s 5 times the price, but as we saw, raw power doesn’t always tell the full story. GPU Boss returns to give us the rundown: how do these cards compare in the real world? Well… a 222% (over three-fold) increase in mHash speed, and a 218% increase in FPS for Battlefield 4. That’s right: for 5 times the cost, you get about 3 times the performance. Truly, the raw specs don’t tell the full story. You increase the cost by 27%, you increase the frame rate in our example game by 22%. You increase the cost by 83%, you increase the frame rate by 83%. Sounds good, but increase the cost by 129% and you get a 79% increase in frame rate (the performance gain now trails the cost increase by 50 points). Increase it by 358%, and you increase the frame rate by 218% (trailing by 140 points). That’s not paying “more for much more power”; that’s a steep drop-off after the third-cheapest option. In fact, did you know that you have to get to the 1060 (6GB) before you can compare the GTX line to a PS4 Pro? Not to mention that at $250, the price of a 1060 (6GB), you could get an entire PS4 Slim bundle, or that you have to get to the 1070 before you beat the Xbox One X. On another note, let’s look at a PS4 Slim…
800 MHz base clock
8 GB VRAM
…Versus a PS4 Pro.
911 MHz base clock
8 GB VRAM
128% increase in floating point speed, 13% increase in clock speed, for a 25% difference in cost. Unfortunately there is no Battlefield 4 comparison to make, but in BF1, the frame rate is doubled (30 fps to 60) and the textures are taken to 11. For what that looks like, I’ll leave it up to this bloke. Not to even mention that you can even get the texture buffs in 4K. Just like how you get a decent increase in performance based on price for the lower-cost GPUs, the same applies here. It’s even worse when you look at the CPU for a gaming PC. The more money you spend, again, the less of a benefit you get per dollar. Hardware Unboxed covers this in a video comparing different levels of Intel CPUs. One thing to note is that the highest i7 option (6700K) in this video was almost always within 10 FPS (though for a few games, 15 FPS) of a certain CPU in that list for just about all of the games. …That CPU was the lowest i3 (6100) option. The lowest i3 was $117 and the highest i7 was $339, a 189% price difference for what was, on average, a 30% or less difference in frame rate. Even the lowest Pentium option (G4400, $63) was often able to keep up with the i7. The CPU and GPU are usually the most expensive and power-consuming parts of a build, which is why I focused on them (other than the fact that they’re the two most important parts of a gaming PC, outside of RAM). With both, this “pay more to get much more performance” idea is pretty much the inverse of the truth.
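The diminishing-returns pattern running through both the GPU and CPU comparisons can be restated numerically. A minimal sketch, assuming the cost increases and frame rates as cited, with the GTX 1050’s BF4 baseline back-calculated (~54 fps, since 66 fps was quoted as a 22% increase):

```python
# Frame-rate gain earned per percent of extra cost, GTX 1050 as the baseline.
baseline_fps = 54  # implied by 66 fps being a 22% increase
gpus = {  # card: (cost increase vs the 1050 in %, BF4 fps as cited)
    "1050 Ti":  (27, 66),
    "1060 3GB": (83, 99),
    "1060 6GB": (129, 97),
    "1080":     (358, 172),  # 172 fps ~ the cited 218% increase
}
value = {card: ((fps / baseline_fps - 1) * 100) / cost_pct
         for card, (cost_pct, fps) in gpus.items()}
# value peaks at the 1060 3GB (~1.0) and falls off sharply above it

# Same idea on the CPU side: i3-6100 ($117) vs i7-6700K ($339),
# for what was on average a <=30% frame-rate difference.
cpu_price_jump = (339 - 117) / 117 * 100  # ~190% more money
```

The `value` ratios make the drop-off visible: roughly 0.8 for the 1050 Ti, about 1.0 for the 1060 3GB, then down near 0.6 for both the 1060 6GB and the 1080.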
“The console giants are bad for game developers, Steam doesn't treat developers as bad as Microsoft or especially Sony.”
Now, one thing you might’ve heard is that the PS3 was incredibly difficult for developers to make games for, which for some fueled the idea that console hardware is difficult to develop on compared to PC… but this ignores a very basic idea that we’ve already touched on: if the devs don’t want to make the game compatible with a system, they don’t have to. In fact, this is why Left 4 Dead and other Valve games aren’t on PS3: they didn’t want to work with its hardware, calling it “too complex.” This didn’t stop the game from selling well over 10 million units worldwide. If anything, this was a problem for the PS3, not the dev team. It also ignores that games like LittleBigPlanet, Grand Theft Auto IV, and Metal Gear Solid 4 all came out on PS3 in the same year as Left 4 Dead (2008). Apparently, plenty of other dev teams didn’t have much of a problem with the PS3’s hardware, or at the very least, they got used to it soon enough. On top of that, when developing the 8th gen consoles, both Sony and Microsoft sought CPUs that were easier for developers to work with, which included design decisions that considered the consoles’ use for more than gaming. Also, using their single-chip proprietary CPUs is cheaper and more energy-efficient than buying pre-made CPUs and boards, which is a far better reason for using them than some conspiracy about Sony and MS trying to make devs’ lives harder. Now, console exclusives are apparently a point of contention: it’s often said that exclusives can cause developers to go bankrupt. However, exclusivity doesn’t have to be a bad thing for the developer. For example, when Media Molecule had to pitch their game to a publisher (Sony, coincidentally), they didn’t end up being tied into something detrimental. Their initial funding lasted 6 months. From then, Sony offered additional funding in exchange for console exclusivity.
This may sound concerning to some, but the game ended up going on to sell almost 6 million units worldwide and launched Media Molecule into the gaming limelight. Sony later bought the development studio, but 1: this was in 2010, two years after LittleBigPlanet’s release, and 2: Media Molecule seem pretty happy about it to this day. If anything, signing up with Sony was one of the best things they could’ve done, in their opinion. Does this sound like a company that has it out for developers? There are plenty of examples that people will use to put Valve in a good light, but even Sony is comparatively good to developers.
“There are more PC gamers.”
The total number of active PC gamers on Steam has surpassed 120 million, which is impressive, especially considering that this is nearly double 2013’s figure (65 million). But the number of monthly active users on Xbox Live and PSN? About 120 million (1, 2) total. EDIT: You could argue that this isn’t an apples-to-apples comparison, sure, so if you want to compare the monthly number of Steam users to consoles? Steam has about half of what the consoles do, at 67 million. Now, back to that 65 million figure for Steam in 2013: the best reference I could find for PlayStation’s number was an article giving the number of registered PSN accounts in 2013, 150 million. In a similar 4-year period (2009 - 2013), the number of registered PSN accounts didn’t double, it sextupled, increasing 6-fold. Considering the PS4 is already at 2/3 of the PS3’s lifetime sales despite being 3 years younger than its predecessor was, I’m sure this trend is at least generally consistent. For example, let’s look at DOOM (2016), an awesome fast-paced shooting title with graphics galore… On a single platform, of course, it sold best on PC/Steam: 2.36 million Steam sales, 2.05 million PS4 sales, 1.01 million Xbox One sales. But keep in mind… when you add the console sales together, you get over 3 million sales on the 8th gen systems. Meaning: this game sold best on console overall. In fact, the Steam sales have only recently surpassed the PS4 sales. By the way, VGChartz only shows sales for physical copies, so the PS4 and Xbox numbers, once digital sales are included, are even higher than 3 million. This isn’t uncommon, by the way. Even with games where the PC sales are higher than either console’s, there are generally more console sales in total. But, to be fair, this isn’t anything new. PC gamers haven’t come to dominate the market; the percentages have always been about this much.
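The DOOM comparison above is just addition; a two-line check using the VGChartz figures as cited:

```python
# DOOM (2016) per-platform sales as cited above (millions of copies).
steam, ps4, xbox_one = 2.36, 2.05, 1.01
console_total = ps4 + xbox_one  # 8th gen consoles combined

print(round(console_total, 2), console_total > steam)  # 3.06 True
```

Largest single platform: Steam. Largest side of the market for this game: consoles.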
PC can end up being the largest single platform for games, but consoles usually sell more copies total. EDIT: There were other examples but... Reddit has a 40,000-character limit.
This isn’t to say that there’s anything wrong with PC gaming, and this isn’t to exalt consoles. I’m not here to be the hipster defending the little guy, nor to put someone or something down out of spite. This is about showing that PCs and consoles are overall pretty similar, because there isn’t much dividing them, and that there isn’t anything wrong with being a console gamer. There isn’t some chasm separating consoles and PCs; at the end of the day they’re both computers that are (generally) designed for gaming. This is about unity as gamers, to show that there shouldn’t be a massive divide just because of the system you game on. I want gamers to be in an environment where specs don’t separate us; whether you got a $250 PS4 Slim or just built a $2,500 gaming PC, we’re here to game and should be able to have healthy interactions regardless of platform. I’m well aware that this isn’t going to fix… much, but it needs to be said: there isn’t a huge divide between PCs and consoles; they’re far more similar than people think. Both sides have upsides and downsides the other doesn’t. There’s so much more I could touch on, like how you can use SSDs or 3.5-inch hard drives with both, or that even though PC part prices go down over time, so do console prices, but I just wanted to touch on the main points people use to needlessly separate the two kinds of systems (looking at you, PCMR) and correct them. I thank anyone who takes the time to read all of this, and especially anyone who doesn’t take what I say out of context. I also want to note that, again, this isn’t “anti-PC gamer.” If it were up to me, everyone would be a hybrid gamer. Cheers.
I'm a fan of Bitcoin and like to support the network by buying and holding BTC and operating a full node, a bitnodes unit that consumes a very minimal 2.5 watts. Just for fun I'd like to run a small miner drawing, say, 20 - 50 watts tops. I know this will not be profitable in any way, but I'd like to do it anyway, more as a hobby than anything. I've mined Litecoin in the past using a bank of 3 graphics cards drawing about 750 W total, and it was a bit impractical for me, to be honest, due to the noise and heat they generated and the uncompetitive price I pay for residential electricity. I figure something that only requires minimal cooling, and therefore has no heat or noise issues, would be ideal. Maybe a USB miner plugged into a Raspberry Pi (since I already run a couple of Pis)? It would be nice to have something fairly competitive in terms of hashes/joule efficiency. I see the Antminer S7 currently leads by a wide margin at about 4000 MHashes/joule. See Mining hardware comparison.
Good day! I am fairly new to bitcoin mining. I've just discovered that my graphics card isn't mining more than 0.00-something, so basically it's not going to pay off much. So I started looking into the new monster hardware coming in the near future, in 2014: https://en.bitcoin.it/wiki/Mining_hardware_comparison Obviously there are many choices of hardware that will pay off ridiculously more than the old Nvidia or AMD card stuck in the average Joe's computer. Of course the investment itself costs money, but considering the current bitcoin price of $710, there is a high probability of making big bucks. As for the mining rigs coming in 2014, some are listed for Q1 and some for Q2 (the first and second quarter), and one of the questions an investor might have is which one is best to preorder based on delivery time. For example, the Butterfly Labs Monarch (http://www.butterflylabs.com/monarch/), which it says comes out January/February, or the Synapse Terra-1 (http://axonlabs.net/?l=), which says "Production / Shipping will start in Q1/Q2 2014. Since units are purchased faster than they can be produced". My question is, with all these options, which one should you buy if you want it delivered as fast as possible? And which one is the best choice?
I've posted this information a lot recently for new miners with NVIDIA cards. This subreddit seemed like the right home for it, and hopefully this will serve as a helpful starting place to clarify the very basics and get people started. As always, watch your GPU core temperatures closely. Lower hash rates correlate with lower operating temperatures. Play with these features to adjust your hash rate according to the load your GPU can handle. For example, one of my cards has better cooling than the other, so I run them at different hash rates to keep both in the temperature range I'm comfortable with. Getting Started (Windows Environment):
Download and install the latest NVIDIA Drivers.
Download and extract the latest version of cudaMiner (SEE BITCOINTALK - CUDAMINER LINK BELOW) .
Create a new text file in the same directory as cudaminer.exe (x32 or x64, depending on your system).
Open text file and enter your configuration into the new batch file (See below for samples. Change settings to match your specific set-up):
Change text file extension to .bat
Execute batch file (not the executable).
To exit, press CTRL+C to break, wait, then Y to exit, OR press the "Red X". If the command window closes immediately, add "pause" to the end of the batch script to view the error.
If running x64 version, try x32 version and compare results.
Command prompt window flashes and closes.
Usually indicates bad syntax or attempting to launch executable. Review batch script settings. Add "pause" to end of batch script to view error.
Stratum Authentication Failed / "HTTP Request failed; No Error"
Indicates connection issue. Review server address & user credentials.
Memory error / Result doesn't validate on CPU / Error 30
You are launching the executable; you cannot do this. Create and launch using batch script instead.
:::Sample Configurations (EDIT TO MATCH YOUR SPECIFIC CREDENTIALS & GPU SETTINGS):::

:::Single GPU
::SingleGPU.bat
cudaminer.exe -i 1 -C 1 -m 1 -H 1 -l auto -o stratum+tcp://YOUR.POOL.ADDRESS:#### -O USER.WORKER:WORKERPW

:::Multi GPU, Multi Command Prompt
::GPU0.bat (Address/Login for Standard Pool Mining, 1st GPU)
cudaminer.exe -d 0 -i 1 -C 1 -m 1 -H 1 -l auto -o stratum+tcp://YOUR.POOL.ADDRESS:#### -O USER.WORKER:WORKERPW

::GPU1.bat (Address/Login for P2Pool Mining, 2nd GPU)
cudaminer.exe -d 1 -i 1 -C 1 -m 1 -H 1 -l K4x32 -o stratum+http://YOUR.P2POOL.ADDRESS:#### -O WALLETADDRESS:ANYPW

:::Multi GPU, Single Command Prompt
::DoubleGPU.bat
cudaminer.exe -d 0,1 -H 1,1 -i 1,1 -l K3x9,K4x32 -C 1,1 -o stratum+tcp://YOUR.POOL.ADDRESS:#### -O USER.WORKER:WORKERPW
FLAGS ARE CASE SENSITIVE
cudaminer.exe: the call to execute cudaMiner, followed by these flags:
-d (any, counts from 0): GPU device selection. Only needed for multi-GPU configurations; create multiple .bat files or use comma-separated values.
-i (0/1): interactive mode. When enabled, it reduces GPU utilization and hash rate to allow for computer use during mining.
-C (0/1/2): enable Texture Cache (Disabled, 1-D caching, 2-D caching). May increase or reduce your hash rate; availability depends on your compute capability (check the WIKI CUDA LINK BELOW).
-m (0/1): single memory block. Consolidates hash work into a single memory block and can lead to lower memory usage. Implicitly enabled with Texture Caching.
-H (0/1/2): (CPU Only, CPU Assist, GPU Only) determines how much hashing work is shared by the CPU. Defaults to GPU Only (2) if not specified.
-l: kernel launch configuration. Autotune ("auto"), autotune for a card generation, or a specific setting. Defaults to autotune if not specified.
-o: full URL of the mining server you wish to connect to.
-O: username (or Username.Workername for pools) and password pair for the mining server.
-D: debug/verbose output to view the block/warp chart and test a configuration.
NOTE ^ : This option is the key to tuning your hash rate and the resulting GPU temperature. Choosing "auto" enables autotuning, allowing cudaMiner to choose the best config. Choosing just "G", the card generation code, will autotune for that specific card generation. "GBxW" is a specific setting you choose for the card, where BLOCK is the row number and WARP is the column number in the autotune chart. Your BLOCKxWARP value should not exceed your maximum core configuration (WIKI GPU LIST BELOW), otherwise cudaMiner will crash or return an error. For best results, the BLOCKxWARP value should divide evenly into your core config.
-For example, for an NVIDIA GT 750M (a Kepler card), row 4, column 32 is K4x32 (4x32=128). This is exactly 1/3 of the max core config of 384, so it does not exceed it.
NOTE & : Autotuning's reported hash rates are not always accurate, but you can use the results in the benchmark table to choose the ballpark hash rate you desire. If you define a setting, or allow it to complete the autotuning, it will then begin the benchmark and show you the average hash rate once you end the session (CTRL+C). Before closing the command prompt, you can scroll back up and save a screenshot of the block table. Type "Y" after ending to close the program. Remember to turn OFF the -D --benchmark flags after you are done: in benchmark mode, cudaMiner will not connect to the pool until you remove them.
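As a quick sanity check on launch configs, the "should not exceed, and ideally divides evenly into, your core config" rule above can be sketched in a few lines of Python. This is a hypothetical helper, not part of cudaMiner, and the generation-letter stripping is an assumption for illustration:

```python
# Hypothetical helper: sanity-check a -l launch config such as "K4x32"
# against a card's CUDA core count (e.g. 384 for a GT 750M).
def check_launch_config(config: str, cuda_cores: int) -> bool:
    """True if BLOCK*WARP does not exceed, and divides evenly into, the core count."""
    # Strip the leading generation letter (e.g. 'K' for Kepler), split "4x32".
    blocks, warps = config.lstrip("KGFTM").split("x")
    threads = int(blocks) * int(warps)
    return threads <= cuda_cores and cuda_cores % threads == 0

# The GT 750M example from the note: 4*32 = 128, and 384/128 = 3 exactly.
print(check_launch_config("K4x32", 384))  # True
```

A config like K8x32 (256 threads) would fail the divisibility check on the same card, which matches the advice to prefer exact divisors.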
Extreme newb here, wondering why my gpu hashrates are so slow?
With the recent boom in bitcoin I wanted to learn more about it by mining solo on my PC, just as a fun experiment. My graphics card is a GeForce GTS 250. I realize this is diddly-squat compared to any reasonable mining rig, but I just wanted to try it as a test. I know that in all probability I will never find a block, but the small 'lottery' appeal of finding a solo-mined block is enough for me to test it out. So I set up a bitcoin wallet and downloaded GUIMiner. So far I have been able to launch Bitcoin-Qt as a server and start mining on GUIMiner as a solo miner using my GTS 250 as the device. https://en.bitcoin.it/wiki/Mining_hardware_comparison lists my card as being able to get 35 Mhash/s, but GUIMiner says I am getting 650 khash/s. I was wondering if anybody could tell me why there is such a big difference between those two numbers? Is there some important setting that I do not have correct? Sorry for the newbie questions, but any help would be appreciated!
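For context on the solo-mining 'lottery' odds here: finding a block takes, on average, difficulty * 2^32 hash attempts, so the expected time follows directly from the card's hash rate. A rough sketch with an illustrative difficulty value (the real figure changes at every retarget):

```python
# Expected time to solo-mine one block: on average it takes
# difficulty * 2**32 hash attempts, so at a given hash rate (hashes/sec):
def expected_block_time_days(difficulty: float, hashrate_hs: float) -> float:
    return difficulty * 2**32 / hashrate_hs / 86400  # seconds -> days

# Illustrative difficulty only; it retargets every 2016 blocks.
print(expected_block_time_days(1_000_000, 35e6))  # roughly 1420 days
```

Even at the wiki's optimistic 35 Mhash/s, a million-difficulty network means a multi-year expected wait, which is why the 'lottery' framing is apt.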
I ordered this graphics card, and I saw the Sapphire alternative on this site has a 27.0 Mhash/s rating. Assuming the conversion from Mhash to khash is the same as from megabytes to kilobytes, that works out to... 27,000 khash/s??? That doesn't seem right, because according to this calculator that would mean I was making over 15 litecoin per day. I seriously doubt it is this easy to produce these. I'm sorry for sounding like an idiot. It's late and I'm probably off by a couple of zeros.
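The unit conversion in the post is actually right, and that is exactly the trap: a card's listed Mhash/s figure is usually its SHA-256 (Bitcoin) rating, while a Litecoin calculator expects a scrypt rate, which for GPUs of that era was measured in hundreds of khash/s. A minimal sketch of the arithmetic:

```python
# Rate prefixes are decimal: 1 Mhash/s = 1,000 khash/s.
mhash_sha256 = 27.0            # the card's listed *SHA-256* rating
khash = mhash_sha256 * 1000
print(khash)  # 27000.0 -- the right conversion, but the wrong number
              # to feed a scrypt (Litecoin) calculator
```

So the "15 LTC/day" result comes from mixing algorithms, not from a zeros mistake.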
I have a 2 GB GDDR5 Nvidia GT 650M. I looked at a list of graphics cards and their respective mining performance, but I couldn't make any sense of it. Here's the list. If it is good enough, what client should I use, and would a pool be better than solo mining?
Imagine a user that is coming from the Bitcoin world. The user is aware of Bitcoin basics and how it works in general. They've recently heard about Litecoin and would like to learn more about it and try it for themselves. The user is trying to decide if Litecoins are a viable option. This user might continue to use Bitcoin but with their newly gained knowledge they have the option of using Litecoin for any transactions they deem appropriate. What kind of information would be useful for their transition as an end user? ....... Here are some starters: 1) Comparison between Litecoin and Bitcoin
Start with reading the wiki page about the differences between Litecoin and Bitcoin
2) Buying Litecoins
The easiest way to buy Litecoins is to purchase Bitcoins then use an online exchange to convert to Litecoins.
I'm not aware of any lightweight clients that are available for Litecoin. I've seen there was an Electrum client but it looked out of date and the last time I ran it there were no servers to connect to. For now the safest option is to use the official Litecoin-Qt client and this requires downloading the whole blockchain. The user could also try a paper wallet for additional security. There are a few online wallets, but those are always risky.
5) Armory client is not available
The Armory client isn't compatible with Litecoin. At one point there was a bounty to add support for Litecoin, but I don't know if any progress was made. For now the only client that is recommended is the official Litecoin client downloaded from litecoin.org
6) The transaction fees are calculated differently between Litecoin and Bitcoin.
I don't know the specifics, but this is one of the cases where Litecoin starts to diverge from Bitcoin. The minimum transaction fee for Litecoin is 0.02 LTC, compared to Bitcoin's minimum of 0.0005 BTC.
7) You cannot use Bitcoin ASIC mining hardware to mine Litecoins
Bitcoin ASICs are built for SHA256 hashing. Litecoin uses Scrypt so it isn't simply plug and play. The ASIC hardware would require additional memory among other things.
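The algorithm difference is easy to see with Python's hashlib, which happens to expose both primitives. A minimal sketch; the input bytes are made up, while in reality Litecoin hashes an 80-byte block header, using it as both the password and the salt:

```python
import hashlib

data = b"example block header bytes"  # made-up stand-in for a real header

# Bitcoin-style: double SHA-256 -- pure computation, ideal for ASICs.
btc_style = hashlib.sha256(hashlib.sha256(data).digest()).hexdigest()

# Litecoin-style: scrypt with N=1024, r=1, p=1 (Litecoin's parameters).
# scrypt needs a working buffer of about 128*r*N bytes (~128 KiB here),
# and that memory requirement is exactly what SHA-256 ASICs lack.
ltc_style = hashlib.scrypt(data, salt=data, n=1024, r=1, p=1, dklen=32).hex()

print(len(btc_style), len(ltc_style))  # 64 64 -- both are 32-byte digests
```

Same-size output, completely different cost profile: one is compute-bound, the other deliberately memory-bound.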
8) Mining Litecoins using CPUs isn't cost effective
Mining efficiently requires some investment in equipment, namely several high-end graphics cards. Your old laptop will not be able to mine a single LTC in months, because the network is mature and thus the mining difficulty is high.
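To put numbers on "not a single LTC in months": your expected share of the block rewards is simply your hash rate divided by the network's. A back-of-the-envelope sketch, where both hash rates are illustrative assumptions:

```python
# Expected LTC per day from pooled mining (ignoring fees and variance).
def ltc_per_day(my_hashrate, network_hashrate, block_reward=50.0,
                block_time_s=150.0):
    blocks_per_day = 86400 / block_time_s  # Litecoin targets 2.5-minute blocks
    return (my_hashrate / network_hashrate) * blocks_per_day * block_reward

# An old laptop CPU at ~10 kH/s against a hypothetical 100 GH/s network:
print(ltc_per_day(10e3, 100e9))  # ~0.00288 LTC/day, i.e. about a year per LTC
```

The formula also shows why a bank of GPUs changes the picture: the payout scales linearly with your hash rate.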
9) Mining Litecoins using your laptop is not recommended
You can easily damage your laptop and cause it to overheat
10) Accepting Litecoin as payment on your website
Use services such as Kojn (beta invite required) and LTCPP (soon) for your Litecoin payment processing needs.
11) Block explorers
There are currently no block explorers with built in wallet functionality like blockchain.info.
Head over to LTC-GLOBAL to purchase and trade stocks and bonds in exchange for your Litecoins.
A warning message copied directly from the site "This site is currently in beta. Nothing is verified. Everything is virtual. Do your homework. Watch out for scams. Be diligent."
TL;DR: A user knows about Bitcoin and is starting into Litecoin. Any useful information for the transition? EDIT: making edits as useful information is posted. Input has been provided by: lastgen, -Mahn
I've been reading extensively about bitcoin for the past few days and there are a few holes here and there that I'm trying to understand. So far it makes me believe that this whole mining thing is some sort of elaborate scam. Here are a few unorganized points that confuse me. Why are other currencies like Litecoin slowly becoming popular, and why do people want them to become popular? What purpose does Litecoin serve that Bitcoin doesn't? If a redundant second-to-bitcoin currency like Litecoin becomes popular, and people want it to become popular, then what stops more such currencies from all becoming popular, making each of them just spammy/redundant in the end? Don't we only need one of these currencies to serve the purpose of worldwide decentralized transactions? The much-referenced mining hardware comparison sheet gives a list of video cards recommended for mining. Combined with this calculator, people can work out whether mining is worth the electricity cost. It seems to yield a little free income at the end of the month, and all seems well until you investigate further. The power consumption shown there for your video card is, in the Radeon 6850's case, only half of what it truly uses at full load. To add to this, I've been mining for more than a day at full power without stopping or interfering with the process. The calculator tells me I should be making 0.0125 bitcoin a day, but I barely made half of that in a bit more than a day, and I am positive the video card ran at full strength the whole time. Now, double the electricity cost versus half the production, and it becomes an almost profitless operation. On top of this, given that the current bitcoin value is tenfold what it was 3 months ago, how could it have been profitable back then if it isn't right now? Another part that looks suspicious to me is those two websites. They offer what every person could ever want: a way to make a lot of money easily.
Both of them deliver their products months after purchase, and the second website especially is selling something that would potentially pay for itself in less than a month, after which huge profits would come in. How convenient, to sell something that yields huge profit and pays for itself so quickly; better to sell it than use it yourself, right? The first site has sold out, and funnily enough is selling the next batch for 75 bitcoin apiece... which they could just mine themselves faster than their delivery time, so what's their gain, really? Conveniently we have the second website, not sold out, selling something similar to the first, without any pictures of what the back of their miner looks like, which won't mention anywhere the power consumption of their product but says it comes with a USB cord, plug and play! That sure doesn't sound fishy at all, since the ASIC counterpart is shown on the comparison sheet as using 600 W; surely a USB port can output that kind of power, right? Hopefully someone can shed some light on all this, for a better understanding of these less commonly asked but quite important questions for anyone who wants to jump on this... bandwagon. I'm legitimately trying to see things optimistically, but so far I only see a few root members trying to scam the entire world by projecting this half-legit currency onto us. Note: sorry for my relatively poor English; I tried putting my thoughts into words as precisely as I could, but I couldn't do it as well as I wish I could.
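The poster's profitability complaint above is easy to quantify: doubling the real power draw while halving the real output flips the sign of the daily margin. A sketch with made-up but plausible numbers; the $0.12/kWh rate, the $90 BTC price, and the 127 W listed draw are all assumptions for illustration:

```python
# Daily mining profit = coin revenue minus electricity cost.
def daily_profit(btc_per_day, btc_price_usd, power_watts, usd_per_kwh):
    revenue = btc_per_day * btc_price_usd
    electricity = power_watts / 1000 * 24 * usd_per_kwh
    return revenue - electricity

# Calculator's promise vs. the poster's experience (half output, double power):
optimistic = daily_profit(0.0125, 90, 127, 0.12)
realistic = daily_profit(0.0125 / 2, 90, 254, 0.12)
print(round(optimistic, 3), round(realistic, 3))  # 0.759 -0.169
```

With these inputs the calculator's figures show about 76 cents a day of profit, while the measured figures show a daily loss, which is the "almost profitless" effect described above.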
Sorry if this is obvious, I recently became interested in btc (and trading it like any other currency). The exchanges I am aware of are Mt.Gox, Btc-E, and Bitstamp. I tend to ramble, so i'll try to keep this short:
Are there any other exchanges I should look at? Are any of these better than the others? (I know you all don't like Mt.Gox very much right now.)
Is there a way to look up lag time for each exchange? Is it usually minutes or seconds?
I noticed each exchange has different prices. At the time of writing, Mt.Gox is at 90, Btc-E is at 92, and Bitstamp is at 88. These numbers were farther apart right after the crash. Will they all slowly approach the same number? Do people make money buying on one exchange and selling on another?
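Those quoted prices make for a concrete spread calculation; whether the gap is a real arbitrage opportunity depends on exchange fees and transfer delays eating the margin:

```python
# Percentage spread between the cheapest and dearest quotes above.
prices = {"MtGox": 90.0, "BtcE": 92.0, "Bitstamp": 88.0}
low, high = min(prices.values()), max(prices.values())
spread_pct = (high - low) / low * 100
print(round(spread_pct, 2))  # 4.55
```

A ~4.5% gross spread sounds attractive, but round-trip fees plus the lag moving funds between exchanges can easily exceed it.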
Do you have any general advice for buying and selling bitcoin?
I'm planning on using Btc-E and buying bitcoins using bitinstant, pushing it to my Qt wallet, and transferring it to Btc-E from there. Is there a more efficient way to push cash to Btc-E? I'm only going to push 20USD for now, just as an experiment.
Is there any risk with holding USD in Btc-E? Is there something safer?
A few days ago, prices were fluctuating between 60 and 90 (by the half hour). Now it's pretty stable. Is this good for the strength of bitcoin? Will people take it more seriously?
NEW QUESTION: Is there any way to get bitcoin historical prices in CSV (or any other easily parsed format) from any exchange, like Yahoo's iChart CSV download? NEW QUESTION: Will the influx of these new ASICs affect the price of bitcoins? Up? Down?
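On the CSV question: whatever the source, daily OHLC exports are usually plain CSV that the standard library parses directly. A sketch with made-up sample data; the column names here are assumptions, so check the real export's header line:

```python
import csv, io

# Hypothetical OHLC export (column names and prices are illustrative only).
sample = """date,open,high,low,close
2013-04-10,200.0,266.0,105.0,160.0
2013-04-11,160.0,180.0,110.0,135.0
"""

# DictReader keys each row by the header line.
rows = list(csv.DictReader(io.StringIO(sample)))
closes = [float(r["close"]) for r in rows]
print(closes)  # [160.0, 135.0]
```

In real use you would replace the embedded string with `open("prices.csv")` for whatever file the exchange or chart site provides.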
Mining: My hardware is a 5770 graphics card, which bitcoin.it says will get about 200 MH/s. I'm hoping to make at least a buck a day (mining in a pool), again more just as an experiment.
If I use linux, and don't run a DE, could I increase my MH/s? Would it be noticeable?
What are your thoughts on ASICs? Is it thought that Butterfly Labs may be a scam? How about Avalon? Would buying one on eBay result in an unhappy, scammed DrWoollyNipples?
Is there any information on how these ASICs work? I'm very curious about the technology, and any links would be appreciated.
Thanks, you all are amazing. Sorry for so many questions.
See also: Non-specialized hardware comparison. Below are statistics about the Bitcoin mining performance of ASIC hardware, including only specialized equipment that has been shipped. GPUs, CPUs and other hardware not specifically designed for Bitcoin mining can be found in the Non-specialized_hardware_comparison. Notes: XEVAN is a hashing algorithm for mining cryptocurrency, developed as a unique combination of the dual X17 algorithm with 128-bit headers. The Xevan algorithm was founded and used by BitSend (BSD). Later, developers of some other cryptocurrencies started to implement the Xevan algo, as it is said to be an ASIC-resistant, energy-efficient and stable algorithm. These are the most current and accurate listings for GPU hash rates. When you are mining, the same values that apply for Bitcoin mining will apply to any SHA-256 coin mining. The same is true for Litecoin and all other scrypt-based coins. We currently like the ATI HD 7950 cards. Some Bitcoin users might wonder why there is a huge disparity between the mining output of a CPU versus a GPU. First, just to clarify: the CPU, or central processing unit, is the part of the computer that performs the will of the software loaded on the computer. It's the main executive for the entire machine, the master that tells all the parts of the computer what to do.