driver for AMD Radeon GPUs (Download) - CRYPTO-MINING

XFX ATI Radeon HD 7950 (3072 MB) (FX795ATDBC) Graphics Card is for sale on cryptothrift.com for Bitcoin and Litecoin https://cryptothrift.com/auctions/crypto-mining-gpu/xfx-ati-radeon-hd-7950-3072-mb-fx795atdbc-graphics-card-2/ submitted by duetschpire to cryptothrift [link] [comments]

Switched from a GTX 1050 to a 5700 XT, it's great!

Edit: Resubmitted, original post removed due to lack of flair.
After 11 years I returned to an AMD GPU. I bought a Sapphire Pulse Radeon 5700 XT about three weeks ago once the non-blower cards weren't selling out in minutes. Price at the time was $409.99.
https://ibb.co/J7DvnYr
Some initial thoughts:
I didn't realize the power draw at idle would be so low (using Global Wattman) at 9W. I didn't know that the dual fans would actually shut off at low usage and make the card silent. It's much quieter than my old card, an EVGA 1050 Gaming, but I assume that's because its fan is much smaller. I haven't stressed the card too much, but at least I now have something I can push for the next few years. I can't wait to get a 4K monitor and see what the card can really do.
I decided to do a look back on my previous video cards:
ATI 9800PRO All In Wonder (January 2005) - $245 - To pair with my NEC 19 inch CRT monitor. I could record TV on my computer, which I thought was amazing at the time (also coincidentally a month before YouTube was founded).
EVGA GeForce 8800GTS (G92) (April 2008) - $280 - Much quicker than the 9800PRO, used it for a very long time, but it was really slow towards the end.
EVGA 1050 Gaming (November 2017) - $105 - I got this in the middle of the bitcoin mining crisis when every card was at or over MSRP. I thought this would just be a filler card until prices came down, but that turned into about 2 years. Worked ok, but some newer games were a struggle.
I do have one question. Now that I have a video card that can support 4k, do you have any suggestions on monitors? I'm currently running dual Acer monitors (24 and 21 inches), but I was thinking of dual 27 inch 4k monitors. Can the 5700 XT handle this? I'm not sure if I absolutely have to have FreeSync, and in the past I've liked Dell's monitors (currently looking at the U2718Q). Thanks!
submitted by 3blue to Amd [link] [comments]

So I finally gave Honeyminer a try. (my personal semi-review)

This review was last updated 11-30-18
When I first got interested in trying this program I couldn't find anything about it. It seems a lot of people were too scared to try it, since there is almost no information about it other than from the web page itself. To be honest, I was a bit scared to try it too. I've tried many other programs of this kind on a "test" machine I'm not afraid to lose, on a secondary network and router, in case it's a scam or going to give me a virus, and I suggest anyone installing mining software do the same as a rule of thumb. Please keep in mind the software is still relatively new and they are working to improve it. They seem to be hiring as well; if you're interested in helping them grow by working for them, look near the bottom for their contact e-mail. ____________________________________________________________________________________________________
This review is for the Windows version of Honeyminer. Because it's still relatively new, I knew it could go one of two ways: "scam software" like almost every mobile mining app and even quite a few desktop ones, or legit. I'm glad to say that after using it for a month it seems legit. I was able to withdraw from it no problem. If your system is really crappy, it might not work that well on your computer or mining rig. There are no ads and the program doesn't seem to disrupt any day-to-day activity, at least not on my main system; however, you can of course expect increased heat production from your system as with any mining software, and adequate cooling is important in mining. Anyway, Honeyminer is as close to easy one-click mining software as I have come. They seem to be making a "pro" version too for more hardcore miners. They do take a fee, which is to be expected (look near the bottom for fee information), but that fee goes down significantly if you have multiple GPUs mining. The good thing about it for me was that it let me set my rig to "autopilot," so to speak. If you wish to see the H/s numbers in real time, go to your settings and view the "expert logs," which will also tell you what coin is being mined at the time. ____________________________________________________________________________________________________________
Pros
Pro and/or con (depending on how you look at it)
Cons:
_________________________________________________________________________________________________
COMPATIBILITY: (sorry it keeps adding asterisks to the card model for no reason)
WORKED ON: every Nvidia card tested so far, with card models dating back from 2014 to now.
Worked on some surprisingly low-end and/or old CPUs and GPUs, like the
AMD Radeon R9 380 card in addition to an AMD Athlon II X3 450 processor, and it mines just fine. Of course that processor doesn't make much on its own, lol, but that's an extra 2 or 3 cents per day by itself. I've also tested it with an i3 and i2. Most AMD cards worked, but I ran into issues with a few, so maybe it's easier for me to just tell you what did not work.
DID NOT WORK ON:
--- Any of the AMD ATI Radeon HD 4250s tested so far (2). On that particular card it didn't work at all for mining; it never enabled the GPU. The CPU on that machine did work, however. It would generate an "error" on startup but otherwise did not disrupt the mining on that system, except that if I turned on idle earning mode I would get a bunch of errors as it tried to access the GPU. We need the functionality to enable or disable hardware individually, I think. (Errors or no errors, it just seems like a good thing to have.)
--- A system that had both AMD Radeon R7 graphics and an AMD A8-7650K Radeon R7 (4C+6G), which surprised me considering some of the things that did work, lol. I think it might just be that one system, but either way I can't vouch that it will work. That system was pre-built and won't allow the parts to be changed or easily removed, and since I have to use it for other things, I unfortunately can't test these parts on another mainboard, at least not without wasting some time, money, and patience that I'd rather dedicate elsewhere for now.
I had some issues using one RX Vega 56 card, but I think it was just that card, because another one did work just fine.________________________________________________________________________
FEES (with comparison to NiceHash)
I'm not sure if this post will be helpful to anyone looking into this software, or to anyone who's looking to try a different mining software, but if it is, great.
-- NiceHash charges the following fees for selling/mining and withdrawing:
Payouts for balances less than 0.1 BTC to an external wallet: 5%
Payouts for balances greater than or equal to 0.1 BTC to an external wallet: 3%
Payouts for balances greater than or equal to 0.001 BTC to the NiceHash wallet: 2%
Withdrawal fees from the NiceHash wallet:
Withdrawals from the NiceHash wallet are subject to a withdrawal fee, which depends on the amount withdrawn and the withdrawal option.
Any BTC wallet, from 0.002 (min) to 0.05 BTC: 0.0001 BTC
Any BTC wallet, more than 0.05 BTC: 0.2% of the withdrawn amount
Coinbase, more than 0.001 BTC: FREE (no fee), but they also say the minimum Coinbase withdrawal limit is adjusted dynamically according to the API load._____________________________________________________________________________
Honeyminer fees are based on the number of GPUs working:
8% for 1 GPU, or 2.5% for 2 or more GPUs.
The only withdrawal fee is the standard BTC transaction fee that the Bitcoin network charges, and it doesn't go to Honeyminer. When they add the other withdrawal options, that fee can be avoided, I suppose.
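To make the two fee schedules above easier to compare, here is a minimal sketch in Python that just encodes the rates as quoted in this review (they may have changed since; the function names and the 0.05 BTC example are my own illustration):

```python
# Fee rules as quoted in this review; rates are illustrative and may be outdated.

def honeyminer_fee_rate(num_gpus):
    """Honeyminer mining fee: 8% for a single GPU, 2.5% for two or more."""
    return 0.08 if num_gpus == 1 else 0.025

def nicehash_payout_fee_rate(balance_btc, to_external_wallet=True):
    """NiceHash payout fee per the tiers quoted above."""
    if to_external_wallet:
        return 0.03 if balance_btc >= 0.1 else 0.05
    return 0.02  # >= 0.001 BTC paid to the internal NiceHash wallet

def net_after_fee(amount_btc, rate):
    """Amount left after the percentage fee is taken."""
    return amount_btc * (1 - rate)

if __name__ == "__main__":
    mined = 0.05  # BTC mined, hypothetical
    print("Honeyminer, 1 GPU :", net_after_fee(mined, honeyminer_fee_rate(1)))
    print("Honeyminer, 6 GPUs:", net_after_fee(mined, honeyminer_fee_rate(6)))
    print("NiceHash, external:", net_after_fee(mined, nicehash_payout_fee_rate(mined)))
```

As the review says, the Honeyminer fee drops sharply once a second GPU is added, which is what makes it competitive with NiceHash's small-balance payout rate.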
_________________________
Earnings (in comparison to NiceHash)
Update: sometimes software/test networks will give a view that can be off plus or minus a few percent compared to actual results. A lot of different things can affect your earnings, including where you are located in the world, ISP issues, crypto price fluctuations, updates to fees, and inaccuracies in test software/networks. I'm not sure how many of you use more than one mining software day to day, but I go back and forth between different ones from time to time, and I think that's good practice to keep options open. I notice that Honeyminer seems to do better for me at night-time; early morning/afternoon is when it has the most trouble raking in the cryptos.
That said, I've been trying to test how this compares to NiceHash earnings with two of my buddies, so this is an average of our three profit/loss results compared to NiceHash. I'm using two 10 GPU / 3 CPU setups, one of my buddies is using two 1 GPU / 2 CPU setups, and the other is using two 30 GPU mini-farms. We each have 2 networks, located relatively close by (less than 0.5 miles at the furthest), one with Honeyminer running and the other with NiceHash, and we are looking at 24-hour periods. When all three of us have the results for one day, we average our results together. In all, we looked at a 14-day period. UPDATE: the results below were done well before the latest update to the software, so I do not know if they have changed; I'd have to do another round, or perhaps some of the community could give me their results and save me a bit of work. I'm not sure when I'd have the time to dig into it again. Sorry that it took me so long before I could get on here to post the results of the last few days of the tests.
Earnings seem to be a bit smaller than NiceHash at times and higher at other times. For me, at least, it seems to pay quicker, and it gets deposited in my account sooner than I expected.
Hopefully, when they let us pick which coin to mine on our own, it may help somewhat, and any of you who want to move smaller volumes will probably benefit when they add the functionality to withdraw other coins/USD.
Anyway, when their autopilot system works, it works great, but when it doesn't, it's just "okay," for lack of a better word...
_____________________________________________________
Contact: they have a contact-us section on their webpage, and they also have a Reddit page which I was made aware of by contacting them: https://www.reddit.com/HoneyMine
Careers: If anyone is interested in working for them, the job listings at the time of this writing were for Senior Java Developer(s) and Customer Service Representative(s); the email listed is [careers@honeyminer.com](mailto:careers@honeyminer.com). I'd suggest you check their site for the requirements. I just added this part to the review as a courtesy in case anyone's interested; it's not meant to be a focus of it. But I know we have some really talented people on Reddit who care passionately about the crypto world, so I'd rather give Honeyminer a chance to have some of those sorts on their team, since it might help improve the software faster for the end users, if that makes sense.
_________________________________________________________
UPDATE: If a question reminds me that I left out something I should have mentioned, I'll try to add it here so people don't have to scroll all over the place. I don't write many reviews (for anything), so I don't know if this one was any good, but I hope it was okay, and I'm still a relatively new Reddit user. I mainly wanted to write this review because there was next to no information on Honeyminer when I looked for it, and maybe it can help anyone who's interested in it.
browolf2 asked: Is it basically like NiceHash then?
A: In a way. It's like NiceHash in that it's cloud-based, but you get paid not just when your pool completes an order; there are no "buyers," only "sellers," if you look at it that way. I hope I'm wording this the right way. It's just straight-up mining and they take their fee, but compared to NiceHash the fees for "mining" are different.
karl0525 asked: Do you know if we can contact the Honeyminer dev team and see if they will communicate here on Reddit? It might give them some good ideas about what us miners are looking for. Worth a try maybe? Thanks.
A: I submitted a question through the "contact us" section of their webpage and got a reply from them; this is the message I received:
Thank you for writing in and for your interest in Honeyminer. We always welcome feedback and suggestions from our users. We are currently planning on expanding our online and social media presence.
Please check out our Reddit page: https://www.reddit.com/HoneyMine
submitted by Joe_Cow to gpumining [link] [comments]

NVIDIA Geforce GTX 280 support question.

Here is a link to the card in question:
http://imgur.com/gallery/5DikS15
I built a PC specifically for mining bitcoin with NiceHash; it has an AMD Phenom II processor and an ATI Radeon graphics card, but it also has an NVIDIA GeForce GTX 280, and I was wondering if the GTX 280 is supported by NiceHash. If it isn't, could a version be released that supports the GPU in question? Thanks!
UPDATE: I said in this post that I have an NVIDIA GeForce GTX 280, NOT an AMD video card or anything like that (that I plan on mining with), so PLEASE STOP ASKING! Also, the point of this thread was to ask if the card was compatible. I don't care if it isn't profitable anymore; I just want to know if it is compatible. I did not know that there was so much unneeded controversy over a person just wanting to mine bitcoin with older hardware. Please stop commenting about it not being profitable anymore. That's all. Have a nice day.

submitted by Killerpokemon11 to NiceHash [link] [comments]

Crypto Mining for Beginners. Is it really worth it?

Image from blokt.com
Mining cryptocoins is an arms race that rewards early adopters. You might have heard of Bitcoin, the first decentralized cryptocurrency that was released in early 2009. Similar digital currencies have crept into the worldwide market since then, including a spin-off from Bitcoin called Bitcoin Cash. You can get in on the cryptocurrency rush if you take the time to learn the basics properly.

Which Alt-Coins Should Be Mined?


Image from btcwarp.com
If you had started mining Bitcoins back in 2009, you could have earned thousands of dollars by now. At the same time, there are plenty of ways you could have lost money, too. Bitcoins are not a good choice for beginning miners who work on a small scale. The current up-front investment and maintenance costs, not to mention the sheer mathematical difficulty of the process, just don't make it profitable for consumer-level hardware. Now, Bitcoin mining is reserved for large-scale operations only.
Litecoins, Dogecoins, and Feathercoins, on the other hand, are three Scrypt-based cryptocurrencies that are the best cost-benefit for beginners.
Dogecoins and Feathercoins would yield slightly less profit with the same mining hardware but are becoming more popular daily. Peercoins can also give a reasonably decent return on your investment of time and energy.
As more people join the cryptocoin rush, your chosen coin could get more difficult to mine, because more expensive hardware will be required to discover coins. You will be forced to either invest heavily if you want to keep mining that coin, or take your earnings and switch to an easier cryptocoin. Understanding the top 3 bitcoin mining methods is probably where you need to begin; this article focuses on mining "scrypt" coins.
Also, be sure you are in a country where bitcoins and bitcoin mining is legal.

Is It Worth It to Mine Cryptocoins?

As a hobby venture, yes, cryptocoin mining can generate a small income of perhaps a dollar or two per day. In particular, the digital currencies mentioned above are very accessible for regular people to mine, and a person can recoup $1000 in hardware costs in about 18-24 months.
As a second income, no, cryptocoin mining is not a reliable way to make substantial money for most people. The profit from mining cryptocoins only becomes significant when someone is willing to invest $3000-$5000 in up-front hardware costs, at which time you could potentially earn $50 per day or more.

Set Reasonable Expectations

If your objective is to earn substantial money as a second income, then you are better off purchasing cryptocoins with cash instead of mining them, and then tucking them away in the hopes that they will jump in value like gold or silver bullion. If your objective is to make a few digital bucks and spend them somehow, then you just might have a slow way to do that with mining.
Smart miners need to keep electricity costs to under $0.11 per kilowatt-hour; mining with 4 GPU video cards can net you around $8.00 to $10.00 per day (depending upon the cryptocurrency you choose), or around $250-$300 per month.
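As a rough sanity check, the arithmetic above can be sketched in a few lines of Python. The daily revenue, rig power draw, and electricity rate below are illustrative assumptions based on the ballpark figures in this section, not measurements:

```python
# Rough mining profitability sketch using the ballpark figures above.
# All inputs are illustrative assumptions; real revenue varies daily
# with coin price and network difficulty.

def daily_profit(revenue_per_day, power_watts, price_per_kwh):
    """Net profit per day after electricity costs."""
    kwh_per_day = power_watts / 1000 * 24
    return revenue_per_day - kwh_per_day * price_per_kwh

def payback_months(hardware_cost, revenue_per_day, power_watts, price_per_kwh):
    """Months to recoup the hardware, assuming constant earnings."""
    net = daily_profit(revenue_per_day, power_watts, price_per_kwh)
    if net <= 0:
        return float("inf")  # never pays back at these rates
    return hardware_cost / (net * 30)

if __name__ == "__main__":
    # Hypothetical 4-GPU rig: ~$9/day gross, ~800 W at the wall, $0.11/kWh
    net = daily_profit(9.00, 800, 0.11)
    print(f"Net per day: ${net:.2f}")          # ≈ $6.89
    print(f"Net per month: ${net * 30:.2f}")   # ≈ $206.64
    print(f"Payback on $1000: {payback_months(1000, 9.00, 800, 0.11):.1f} months")
```

Plugging in different electricity rates makes the $0.11/kWh advice above concrete: at $0.25/kWh the same rig nets only about $4.20/day, and the payback period nearly doubles.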
The two catches are:
1) The up-front investment in purchasing 4 ASIC processors or 4 AMD Radeon graphics processing units
2) The market value of cryptocoins
Now, there is a small chance that your chosen digital currency will jump in value alongside Bitcoin at some point. Then, possibly, you could find yourself sitting on thousands of dollars in cryptocoins. The emphasis here is on "small chance," with small meaning "slightly better than winning the lottery."
If you do decide to try cryptocoin mining, definitely do so as a hobby with a very small income return. Think of it as "gathering gold dust" instead of collecting actual gold nuggets. And always, always, do your research to avoid a scam currency.

How Cryptocoin Mining Works

Let's focus on mining scrypt coins, namely Litecoins, Dogecoins, or Feathercoins. The whole focus of mining is to accomplish three things:
- Provide bookkeeping services to the coin network. Mining is essentially 24/7 computer accounting called "verifying transactions."
- Get paid a small reward for your accounting services by receiving fractions of coins every couple of days.
- Keep your personal costs down, including electricity and hardware.

The Laundry List: What You Will Need to Mine Cryptocoins


https://preview.redd.it/gx65tcz0ncg31.jpg?width=1280&format=pjpg&auto=webp&s=f99b79d0ff96fe7d529dc20d52964b46306fb070
You will need ten things to mine Litecoins, Dogecoins, and/or Feathercoins.
1) A free private database called a coin wallet. This is a password-protected container that stores your earnings and keeps a network-wide ledger of transactions.
2) A free mining software package, like this one from AMD, typically made up of cgminer and stratum.
3) A membership in an online mining pool, which is a community of miners who combine their computers to increase profitability and income stability.
4) Membership at an online currency exchange, where you can exchange your virtual coins for conventional cash, and vice versa.
5) A reliable full-time internet connection, ideally 2 megabits per second or faster speed.
6) A hardware setup location in your basement or other cool and air-conditioned space.
7) A desktop or custom-built computer designed for mining. Yes, you may use your current computer to start, but you won't be able to use the computer while the miner is running. A separate dedicated computer is ideal. Do not use a laptop, gaming console or handheld device to mine. These devices just are not effective enough to generate income.
8) An ATI graphics processing unit (GPU) or a specialized processing device called a mining ASIC chip. The cost will be anywhere from $90 used to $3000 new for each GPU or ASIC chip. The GPU or ASIC will be the workhorse of providing the accounting services and mining work.
9) A house fan to blow cool air across your mining computer. Mining generates substantial heat, and cooling the hardware is critical for your success.
10) A strong appetite for reading and constant learning, as there are ongoing technology changes and new techniques for optimizing coin mining results. The most successful coin miners spend hours every week studying the best ways to adjust and improve their coin mining performance.

Original Blog Post: https://www.lifewire.com/cryptocoin-mining-for-beginners-2483064
submitted by Tokenberry to NewbieZone [link] [comments]

such beginner shibe thread wow how to get coin

 how to shibecoin v rich in minutes much instruct so simple any doge can do 

START HERE

UPDATE 1/21/14: I'm not updating this guide anymore. Most of the steps should still work though. See the wiki or check the sidebar for updated instructions.
Before you do anything else, you need to get a wallet. Until there's a secure online wallet, this means you need to download the dogecoin client.
Now open the client you just downloaded. You'll be given a default address automatically, and it should connect to peers and start downloading the dogechain (aka blockchain in formal speak). You'll know because there will be a progress bar at the bottom and at the lower right there should be a signal strength icon (TODO: add screenshots).
If you've waited 2 or 3 minutes and nothing is happening, copy this:
maxconnections=100
addnode=95.85.29.144
addnode=162.243.113.110
addnode=146.185.181.114
addnode=188.165.19.28
addnode=166.78.155.36
addnode=doge.scryptpools.com
addnode=doge.netcodepool.org
addnode=doge.pool.webxass.de
addnode=doge.cryptopool.it
addnode=pool.testserverino.de
addnode=doge.luckyminers.com
addnode=doge.cryptovalley.com
addnode=miner.coinedup.comdoge
addnode=doge.cryptoculture.net
addnode=dogepool.pw
addnode=doge.gentoomen.org
addnode=doge.cryptominer.net
addnode=67.205.20.10
addnode=162.243.113.110
addnode=78.46.57.132
And paste it into a new text file called dogecoin.conf, which you then place into the dogecoin app directory.
Now restart your qt client and the blockchain should start downloading in about 1-2 minutes.
Once it finishes downloading, you're ready to send and receive Dogecoins!

GETTING COINS

Decide how you want to get Dogecoin. Your options are:
I'll go into detail about each of these. I'm currently writing this out. I'll make edits as I add sections. Suggestions are welcome.

MINING

Mining is how new dogecoins are created. If you're new to cryptocurrencies, read this. To mine (also called "digging"), a computer with a decent GPU (graphics card) is recommended. You can also mine with your CPU, but it's not as efficient.

GPU MINING

These instructions cover only Windows for now. To mine, you'll need to figure out what GPU you have. It'll be either AMD/ATI or Nvidia. The setup for both is approximately the same.

Step One: Choose a pool

There's a list of pools on the wiki. For now it doesn't really matter which one you choose. You can easily switch later.
NOTE: You can mine in two ways. Solo mining is where you mine by yourself. When you find a block you get all the reward. Pool mining is when you team up with other miners to work on the same block together. This makes it more likely that you'll find a block, but you won't get all of it, you'll have to split it up with others according to your share of the work. Pool mining is recommended because it gives you frequent payouts, because you find more blocks. The larger the pool you join, the more frequent the payouts, but the smaller the reward you get.
Over a long period of time the difference between pool and solo mining goes away, but if you solo mine it might be months before you get any coins.
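The pool-vs-solo trade-off above is really a variance trade-off, which a toy simulation can illustrate. The hashrate shares, block count, and reward below are made-up numbers purely for illustration:

```python
# Toy simulation of solo vs pool mining variance.
# Hashrate shares, block count, and reward are illustrative assumptions.
import random

def simulate(days, your_share, blocks_per_day, reward, pooled, pool_share):
    """Return total coins earned over `days`.

    Solo: you win each block with probability equal to your share of
    network hashrate. Pool: the pool wins with its (larger) share, and
    you receive a cut proportional to your share of the pool.
    """
    total = 0.0
    for _ in range(days):
        for _ in range(blocks_per_day):
            if pooled:
                if random.random() < pool_share:
                    total += reward * (your_share / pool_share)
            else:
                if random.random() < your_share:
                    total += reward
    return total

if __name__ == "__main__":
    random.seed(1)
    # You: 0.01% of the network; pool: 10% of the network; 100 blocks/day
    solo = [simulate(30, 1e-4, 100, 250_000, False, 0.10) for _ in range(20)]
    pool = [simulate(30, 1e-4, 100, 250_000, True, 0.10) for _ in range(20)]
    print("solo 30-day totals:", sorted(solo))  # mostly zeros, rare jackpots
    print("pool 30-day totals:", sorted(pool))  # steady small amounts
```

Both strategies have the same expected earnings, but the solo runs are mostly zero with an occasional full block reward, while the pool runs cluster tightly around the average — exactly the "frequent payouts" point made above.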

Step two: Set up pool account

The pool you chose should have a getting started page. Read it and follow the instructions. Instructions vary but the general idea is:
When you're done with this, you'll need to know:

Step three: Download mining software

For best performance you'll need the right mining software.
Unzip the download anywhere you want.

Step four: Set up miner

Create a text file in the same folder as your miner application. Inside, put the command you'll be running (remove the angle brackets when you substitute your own values).
For AMD it's cgminer.exe --scrypt -o stratum+tcp://<server>:<port> -u <username> -p <password>
For Nvidia it's cudaminer.exe -o stratum+tcp://<server>:<port> -O <username>:<password>
Substitute the right stuff in for the placeholders. Then on the next line of the text file type pause. This will let you see any errors that you get. Then save the file with any name you want, as long as the file extension is .bat. For example, mine_serverName.bat.

Step five: Launch your miner

Just open the .bat file and a command line window should pop up, letting you know that the miner is starting. Once it starts, it should print out your hash rate.
If you now go to the pool website, the dashboard should start showing your hashrate. At first it'll be lower than what it says in the miner, but that's because the dashboard is taking a 5 minute average. It'll catch up soon enough.
NOTE: A normal hashrate is anywhere between 50 kH/s and 1 MH/s, depending on your GPU.

You're now mining Dogecoins

That's it, nothing more to it.

CPU MINING

CPU mining isn't really recommended, because you'll spend more on power than you'd make from mining Dogecoin. You could better spend that money buying Dogecoin through trading. But if you have free electricity and want to try it out, check out this informative forum post.

Trading

Trading has been difficult so far, but Dogecoin just got added to a few new exchanges. If you don't have a giant mining rig, this is probably the best way to get 100k or more dogecoins at the moment. I'll write up a more complete guide, but for now check out these sites:

Faucets

Faucets are sites that give out free coins. Usually a site will give out somewhere between 1 and 100 Dogecoin. Every site has its own time limits, but usually you can only receive coins once every few hours, or in some cases, days. It's a great way to get started. All you do is copy your address from the receive section of your wallet and enter it on some faucet sites. Check out /dogecoinfaucets for more. If you go to each site on there you might end up with a couple hundred Dogecoin!

Begging

This method is pretty straightforward. Post your receiving address, and ask for some coins. Such poor shibe. The only catch is, don't do it here! Please go to /dogecoinbeg.

Tips

At the moment there are two tip bots:
Other redditors can give you Dogecoin by summoning the tip bot, something like this:
+dogetipbot 5 doge
This might happen if you make a good post, or someone just wants to give out some coins. Once you receive a tip you have to accept it within a few days or else it'll get returned. Do this by following the instructions in the message you receive in your inbox: you reply to the bot with "+accept". Commands go in the message body. Once you do that, the bot will create a tipping address for you, and you can use the links in the message you receive to see your info, withdraw coins to your dogecoin-qt wallet, see your history, and a bunch of other stuff.
As a bonus, so_doge_tip has a feature where you can get some Dogecoins to start with in exchange for how much karma you have. To do this, send the message "+redeem DOGE" to so_doge_tip. You'll need to create a tipping account if you don't have one.
If you want to create a tipping account without ever being tipped first, message either of the bots with "+register" and an address will be created for you.

CHANGELOG

  • 1/21/14 - Added note about this thread no longer being updated
  • 1/21/14 - Changed wallet links to official site
  • 12/27/13 - Added 1.3 wallet-qt links
  • 12/21/13 - Added new windows 1.2 wallet link
  • 12/20/13 - Fixed +redeem text
  • 12/18/13 - Added short blurb on trading.
  • 12/18/13 - Updated cudaminer to new version (cudaminer-2013-12-18.zip).
  • 12/18/13 - Fixed +redeem link
  • 12/18/13 - Updated dogecoin.conf, from here.
  • 12/17/13 - Linked to mining explanation.
  • 12/17/13 - Added link to CPU mining tutorial, in response to this.
  • 12/16/13 - Added links to tip commands, link to dogetipbot wiki.
  • 12/16/13 - Note about tip commands going in body, in response to this.
  • 12/16/13 - Added link to cgminer mirror, thanks to scubasteve812 and thanks to Bagrisham.
  • 12/16/13 - Note about removing brackets in response to this.
  • 12/15/13 - Fixed hash rate as per this comment, thanks lleti
  • 12/15/13 - Added info for all other ways of getting money, except for trading (placeholder for now)
  • 12/15/13 - Added windows GPU mining instructions
  • 12/15/13 - Added wallet instructions, list of how to get money
submitted by lego-banana to dogecoin [link] [comments]

Console gaming is hardly different from PC gaming, and much of what people say about PC gaming to put it above console gaming is often wrong.

I’m not sure about you, but for the past few years, I’ve been hearing people go on and on about the PC's "superiority" over the console market. People cite various reasons why they believe gaming on a PC is “objectively” better than console gaming, often related to power, cost, ease of use, and freedom.
…Only problem: much of what they say is wrong.
There are many misconceptions being thrown about in PC gaming vs console gaming debates that I believe need to be addressed. This isn’t about “PC gamers being wrong” or “consoles being the best,” absolutely not. I just want to cut through some of the stuff people use to put down console gaming, and show that console gaming is incredibly similar to PC gaming. Yes, this is coming from someone who mainly games on console, but I’m also getting a new PC that I will game on, not to mention the 30 PC games I already own and play. I’m not particularly partial to one over the other.
Now I will mainly be focusing on the PlayStation side of the consoles, because I know it best, but much of what I say will apply to Xbox as well. Just because I don’t point out many specific Xbox examples, doesn’t mean that they aren’t out there.

“PCs can use TVs and monitors.”

This one isn’t so much of a misconception as it is the implication of one, and overall just… confusing. This is in some articles and the pcmasterrace “why choose a PC” section, where they’re practically implying that consoles can’t do this. I mean, yes, as long as the ports of your PC match up with your screen(s) inputs, you could plug a PC into either… but you could do the same with a console, again, as long as the ports match up.
I’m guessing the idea here is that gaming monitors often use Displayport, as do most dedicated GPUs, and consoles are generally restricted to HDMI… But even so, monitors often have HDMI ports. In fact, PC Magazine has just released their list of the best gaming monitors of 2017, and every single one of them has an HDMI port. A PS4 can be plugged into these just as easily as a GTX 1080.
I mean, even if the monitor/TV doesn’t have HDMI or AV to connect with your console, just use an adapter. If you have a PC with ports that don’t match your monitor/TV… use an adapter. I don’t know what the point of this argument is, but it’s made a worrying number of times.

“On PC, you have a wide range of controller options, but on console you’re stuck with the standard controller."

Are you on PlayStation and wish you could use a specific type of controller that suits your favorite kind of gameplay? Despite what some may believe, you have just as many options as PC.
Want to play fighting games with a classic arcade-style board, featuring the buttons and joystick? Here you go!
Want to get serious about racing and get something more accurate and immersive than a controller? Got you covered.
Absolutely crazy about flying games and, like the racers, want something better than a controller? Enjoy!
Want Wii-style motion controls? They’ve been around since the PS3. Prefer the form factor of the Xbox One controller but own a PS4? Hori’s got you covered. And if keyboard and mouse is what keeps you on PC, there’s a PlayStation-compatible solution for that. Want to use the keyboard and mouse that you already own? Where there’s a will, there’s a way.
Of course, these aren’t isolated examples; there are plenty of options for each of these kinds of controllers. You don’t have to be on PC to enjoy alternate controllers.

“On PC you could use Steam Link to play anywhere in your house and share games with others.”

PS4 Remote play app on PC/Mac, PSTV, and PS Vita.
PS Family Sharing.
Using the same PSN account on multiple PS4s/Xbox Ones and PS3s/360s, or using multiple accounts on the same console.
In fact, if multiple users are on the same PS4, only one has to buy the game for both users to play it on that one PS4. On top of that, only one of them has to have PS Plus for both to play online (if the one with PS Plus registers the PS4 as their main system).
PS4 Share Play; if two people on separate PS4s want to play a game together that only one of them owns, they can join a Party and the owner of the game can have their friend play with them in the game.
Need I say more?

“Gaming is more expensive on console.”

Part one, the Software
This is one that I find… genuinely surprising. There have been a few times I’ve mentioned that part of the reason I chose a PS4 is for budget gaming, only to be told that “games are cheaper on Steam.” To be fair, there are a few games on PSN/XBL that are more expensive than they are on Steam, so I can see how someone could believe this… but apparently they forgot about discs.
Dirt Rally, a hardcore racing sim game that’s… still $60 on all 3 platforms digitally… even though its successor is out.
So does this mean you have to pay full retail for this racing experience? Nope, because disc prices.
Just Cause 3, an insane open-world experience that could essentially be summed up as “break stuff, screw physics.” And it’s a good example of where the Steam price is lower than PSN and XBL:
Not by much, but still cheaper on Steam, so cheaper on PC… until you look at the disc prices.
See my point? Oftentimes the game is cheaper on console because of the disc alternative that’s available for practically every console game, even when the game is brand new.
Dirt 4 - Remember that Dirt Rally successor I mentioned?
Yes, you could either buy this relatively new game digitally for $60, or just pick up the disc at a discounted price. And again, this is for a game that came out two months ago, whose predecessor’s digital cost is still locked at $60. Of course, I’m not going to ignore the fact that Dirt 4 is currently (as of writing this) discounted on Steam, but on PSN it also happens to be discounted by about the same amount.
Part 2: the Subscription
Now… let’s not ignore the elephant in the room: PS Plus and Xbox Live Gold. These would be ignorable if they weren’t required for online play (on the PlayStation side, it’s only required on PS4, but still). So yes, assuming you play online, it’s a cost you have to factor into your PS4 or Xbox One/360. Bummer, right?
Here’s the thing: although you have to factor this $60 annual cost in with your console, you can make it balance out at worst, and make it work for you as a budget gamer at best. As nice as it would be to not have to deal with the price at all, it’s not a problem if you use the membership correctly.
Imagine going to a new restaurant. This restaurant has some meals that you can’t get anywhere else, and fair prices compared to competitors. Only problem: you have to pay a membership fee to have the sides. Now you can have the main course, sit down and enjoy your steak or pasta, but if you want to have a side to have a full meal, you have to pay an annual fee.
Sounds shitty, right? But here’s the thing: not only does this membership allow you to have sides with your meal, but it also allows you to eat two meals for free every month, and also gives you exclusive discounts for other meals, drinks, and desserts.
Let’s look at PS Plus for a minute: for $60 per year, you get:
  • 2 free PS4 games, every month
  • 2 free PS3 games, every month
  • 1 PS4/PS3 and Vita compatible game, and 1 Vita-only game, every month
  • Exclusive/Extended discounts, especially during the weekly/seasonal sales (though you don’t need PS Plus to get sales, PS Plus members get to enjoy the best sales)
  • access to online multiplayer
So yes, you’re paying extra because of that membership, but what you get with that deal pays for it and then some. In fact, let’s ignore the discounts for a minute: you get 24 free PS4 games, 24 free PS3 games, and 12 Vita-only plus 12 Vita-compatible games, up to 72 free games every year. Even if you only own one of these consoles, that’s still 24 free games a year. Sure, some months you might get games you don’t like; just wait until next month.
In fact, let’s look at Just Cause 3 again. It was free for PS Plus members in August, which is a pretty big deal. Why is this significant? Because it’s, again, a $60 digital game. That means with this one download, you’ve balanced out your $60 annual fee. Meaning? Every free game after that is money saved, and every discount after that is money saved. And this is a trend: every year, PS Plus will release a game that balances out the entire service cost, then another 23 that only add icing to the budget cake. (You could instead count several smaller games as paying off PS Plus until you hit $60 in savings; the result is the same.)
All in all, PS Plus, and Xbox Gold which offers similar options, saves you money. On top of that, again, you don't need to have these to get discounts, but with these memberships, you get more discounts.
Now, I’ve seen a few Steam games go up for free for a week, but what about being free for an entire month? And even if you want to talk about Steam Summer Sales, what about the PSN summer sale, or, again, disc discounts? It would take a lot of research and math to determine whether every console gamer saves money compared to every Steam gamer on the same games, but at the very least? The costs balance out.
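To put the subscription math in one place, here’s a quick sketch (assumptions mine: every free game is valued at its digital list price, and you claim everything offered):

```python
# Rough PS Plus value model based on the figures above.
ANNUAL_FEE = 60  # USD per year for PS Plus

# Monthly lineup as described: 2 PS4 + 2 PS3 + 2 Vita-related games.
free_games_per_month = 2 + 2 + 2
free_games_per_year = free_games_per_month * 12  # 72 across all platforms

# One $60 digital title (e.g. Just Cause 3 in August) balances the fee by itself.
just_cause_3_value = 60
net_after_one_game = just_cause_3_value - ANNUAL_FEE  # 0: break-even

print(free_games_per_year, net_after_one_game)
```

Every claimed game or exclusive discount past that break-even point is money saved, which is the whole argument in a nutshell.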
Part 3, the Systems
  • Xbox and PS2: $299
  • Xbox 360 and PS3: $299 and $499, respectively
  • Xbox One and PS4: $499 and $399, respectively.
Rounded up a few dollars, that’s $1,000 - $1,300 in day-one consoles, just to keep up with the games! Crazy, right? So-called budget systems, such a rip-off.
Well, keep in mind that the generations here aren’t short.
The 6th generation, from the launch of the PS2 to the launch of the next-generation consoles, lasted 5 years, or 6 based on the launch of the PS3 (though you could say it was 9 or 14, since the Xbox wasn’t discontinued until 2009, and the PS2 was supported all the way to 2014, a year after the PS4 was released). The 7th gen lasted 7 - 8 years, again depending on whether you count from the launch of the Xbox 360 or the PS3. The 8th gen so far has lasted 4 years. That’s 17 years that the console money is spread over. A Netflix subscription on its original $8 monthly plan for that amount of time would total over $1,600.
And let’s be fair here: just like you could upgrade your PC hardware whenever you wanted, you didn’t have to get a console at launch. Look at PlayStation again, for example. In 2002, only two years after its release, the PS2’s retail price was cut from $300 to $200. The PS3 Slim, released 3 years after the original, was $300, $100 - $200 lower than the launch cost. The PS4? You could’ve either gotten the Uncharted bundle for $350, or one of the PS4 Slim bundles for $250. This all brings it down to $750 - $850, which, again, is spread over a decade and a half. And that’s not even counting used consoles, sales, or the further price cuts I didn’t mention.
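The generation-cost arithmetic works out like this (prices are the ones quoted above; the Netflix figure uses its original $8/month plan):

```python
# Day-one launch prices (USD) across both brands, as listed above.
day_one = {"Xbox": 299, "PS2": 299, "Xbox 360": 299, "PS3": 499,
           "Xbox One": 499, "PS4": 399}

# Sticking with one brand at launch, e.g. PlayStation:
playstation_launch = 299 + 499 + 399      # PS2 + PS3 + PS4 = 1197

# Waiting for price cuts instead: PS2 at $200, PS3 Slim at $300,
# PS4 Slim bundle at $250.
playstation_patient = 200 + 300 + 250     # = 750

# For scale, Netflix's original $8/month plan over the same 17 years:
netflix_total = 8 * 12 * 17               # = 1632

print(playstation_launch, playstation_patient, netflix_total)
```

Even buying every PlayStation at launch costs less over those 17 years than the cheapest Netflix plan would have.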
Even if that still sounds like a lot of money to you, even if you’re laughing at the thought of buying new systems every several years, because your PC “is never obsolete,” tell me: how many parts have you changed out in your PC over the years? How many GPUs have you been through? CPUs? Motherboards? RAM sticks, monitors, keyboards, mice, CPU coolers, hard drives— that adds up. You don’t need to replace your entire system to spend a lot of money on hardware.
Even if you weren’t upgrading for the sake of upgrading, I’d be amazed if hardware you’ve been pushing with gaming lasted even a third of that 17-year period. Computer parts aren’t designed to last forever, and they really won’t when you’re pushing them with intensive gaming for hours upon hours. Generally speaking, your components might last you 6-8 years, if you’ve got the high-end stuff. But let’s assume you bought a system 17 years ago that was a beast for its time, something so powerful that even if its parts have degraded over time, it’s still going strong. Problem is: you will have to upgrade something eventually.
Even if you’ve managed to get this far into the gaming realm with the same 17-year-old hardware, I’m betting you didn’t do it with a 17-year-old operating system. How much did Windows 7 cost you? Or 8.1? Or 10? And don’t think you can skirt the cost by getting a pre-built system; the cost of Windows is embedded in the cost of the machine (why else would Microsoft allow their OS to go on so many machines?).
Sure, Windows 10 was a free upgrade for a year, but that’s only half of its lifetime: you can’t get it for free now, and couldn’t for the past year. On top of that, the free period was an upgrade; you had to pay for 7 or 8 first anyway.
Point is, as much as one would like to say that they didn’t need to buy a new system every so often for the sake of gaming, that doesn’t mean they haven’t been paying for hardware, and even if they’ve only been PC gaming recently, you’ll be spending money on hardware soon enough.

“PC is leading the VR—“

Let me stop you right there.
If you add together the total number of Oculus Rifts and HTC Vives sold to this day, and threw in another 100,000 just for the sake of it, that number would still be under the number of PSVR headsets sold.
Why could this possibly be? Well, for a simple reason: affordability. The systems needed to run the PC headsets cost $800+, and the headsets themselves are $500 - $600 when discounted. PSVR, on the other hand, costs $450 for the full bundle (headset, camera, and Move controllers, with a demo disc thrown in), and can be played on either a $250 - $300 console or a $400 console, the latter recommended. Even if you want to say that the Vive and Rift are more refined, a full PSVR set, system and all, could cost just over $100 more than a Vive headset alone.
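Those price points stack up as follows (USD, using the low end of each range quoted above):

```python
# PC VR: a capable rig plus a discounted headset.
pc_rig = 800        # low end of "the systems... cost $800+"
pc_headset = 500    # low end of the discounted Rift/Vive range
full_pc_vr = pc_rig + pc_headset          # 1300

# PlayStation VR: the full bundle plus the cheapest console option.
psvr_bundle = 450   # headset, camera, Move controllers, demo disc
ps4_slim = 250      # low end of the $250-$300 range
full_psvr = psvr_bundle + ps4_slim        # 700

# Versus a Vive headset alone (~$600 discounted):
print(full_pc_vr, full_psvr, full_psvr - 600)
```

That last figure is the “just over $100 more than a Vive headset alone” comparison.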
If anything, PC isn’t leading the VR gaming market, the PS4 is. It’s the system bringing VR to the most consumers, showing them what the future of gaming could look like. Not to mention that as the PlayStation line grows more powerful (4.2 TFLOP PS4 Pro, 10 TFLOP “PS5…”), it won’t be long until the PlayStation line can use the same VR games as PC.
Either way, this shows that there is a console equivalent to the PC VR options. Sure, there are some games you'd only be able to play on PC, but there are also some games you'd only be able to play on PSVR.
…Though to be fair, if we’re talking about VR in general, these headsets don’t even hold a candle to, surprisingly, Gear VR.

“If it wasn’t for consoles holding devs back, then they would be able to make higher quality games.”

This one is based on the idea that consoles are so “low spec” that when developers have to take them into account, they can’t design the game to be nearly as good as it would be otherwise. I mean, have you ever seen the minimum specs for games on Steam?
GTA V
  • CPU: Intel Core 2 Quad CPU Q6600 @ 2.40GHz (4 CPUs) / AMD Phenom 9850 Quad-Core Processor (4 CPUs) @ 2.5GHz
  • Memory: 4 GB RAM
  • GPU: NVIDIA 9800 GT 1GB / AMD HD 4870 1GB (DX 10, 10.1, 11)
Just Cause 3
  • CPU: Intel Core i5-2500k, 3.3GHz / AMD Phenom II X6 1075T 3GHz
  • Memory: 8 GB RAM
  • GPU: NVIDIA GeForce GTX 670 (2GB) / AMD Radeon HD 7870 (2GB)
Fallout 4
  • CPU: Intel Core i5-2300 2.8 GHz/AMD Phenom II X4 945 3.0 GHz or equivalent
  • Memory: 8 GB RAM
  • GPU: NVIDIA GTX 550 Ti 2GB/AMD Radeon HD 7870 2GB or equivalent
Overwatch
  • CPU: Intel Core i3 or AMD Phenom X3 8650
  • Memory: 4 GB RAM
  • GPU: NVIDIA GeForce GTX 460, ATI Radeon HD 4850, or Intel HD Graphics 4400
Witcher 3
  • Processor: Intel CPU Core i5-2500K 3.3GHz / AMD CPU Phenom II X4 940
  • Memory: 6 GB RAM
  • Graphics: Nvidia GPU GeForce GTX 660 / AMD GPU Radeon HD 7870
Actually, bump all the memory requirements up to 8 GB, and those are some decent specs, relatively speaking. And keep in mind these are the minimum specs just to run the games. It’s almost as if the devs didn’t worry about console specs when making the PC version, because this version of the game isn’t on console. Or maybe the consoles aren’t holding the games back that much because they’re not that weak. Just a hypothesis.
But I mean, the devs are still ooobviously having to take weak consoles into mind right? They could make their games sooo much more powerful if they were PC only, right? Right?
No. Not even close.
iRacing
  • CPU: Intel Core i3, i5, i7 or better or AMD Bulldozer or better
  • Memory: 8 GB RAM
  • GPU: NVidia GeForce 2xx series or better, 1GB+ dedicated video memory / AMD 5xxx series or better, 1GB+ dedicated video memory
Playerunknown’s Battlegrounds
  • CPU: Intel Core i3-4340 / AMD FX-6300
  • Memory: 6 GB RAM
  • GPU: nVidia GeForce GTX 660 2GB / AMD Radeon HD 7850 2GB
These are PC-only games. That’s right, no consoles to hold them back; they don’t have to worry about whether an Xbox One could handle it. Yet they don’t require anything more than the multiplatform games.
Subnautica
  • CPU: Intel Haswell 2 cores / 4 threads @ 2.5Ghz or equivalent
  • Memory: 4GB
  • GPU: Intel HD 4600 or equivalent - This includes most GPUs scoring greater than 950pts in the 3DMark Fire Strike benchmark
Rust
  • CPU: 2 GHz
  • Memory: 8 GB RAM
  • DirectX: Version 11 (they don’t even list a GPU)
So what’s the deal? Theoretically, if developers don’t have to worry about console specs, then why aren’t they going all-out and making games that no console could even dream of supporting?
Low-end PCs.
What, did you think people only game on Steam if they spent at least $500 on gaming hardware? Not all PC gamers have gaming-PC specs, and if devs close their games out to players who don’t have the strongest of PCs, then they’d be losing out on a pretty sizable chunk of their potential buyers.
Saying “devs having to deal with consoles is holding gaming back” is like saying “racing teams having to deal with Ford is holding GT racing back.” A: racing teams don’t have to deal with Ford if they don’t want to, which is probably why many of them don’t, and B: even though Ford doesn’t make the fastest cars overall, they still manage to make cars that are awesome on their own, they don’t even need to be compared to anything else to know that they make good cars.
I want to go back to that previous point though, developers having to deal with low-end PCs, because it’s integral to the next point:

“PCs are more powerful, gaming on PC provides a better experience.”

This one isn’t so much of a misconception as it is… misleading.
Did you know that according to the Steam Hardware & Software Survey (July 2017), the percentage of Steam gamers who use a GPU less powerful than a PS4 Slim’s is well over 50%? Things get dismal when you compare against the PS4 Pro (or Xbox One X). On top of that, only about 20% of PC gamers own an Nvidia 10-series card (about 15% counting just the 1060, 1070, and 1080).
Now, to be fair, the large majority of gamers have CPUs with considerably high clock speeds, which is the main factor in CPU gaming performance. But the number of Steam gamers with as much RAM as a PS4 or Xbox One, or more, is less than 50%, which can really bottleneck what those CPUs can handle.
These numbers are hardly better than they were in 2013, all things considered. Sure, a PS3/360 weeps in the face of even a $400 PC, but in this day and age, consoles have definitely caught up.
Sure, we could mention that even 1% of Steam accounts represents over a million accounts, but that scale argument cuts both ways: the number of Nvidia 10-series owners may be over 20 million, but over five times that many 8th-gen consoles have been sold.
Basically, even though PCs run on a spectrum, saying they're more powerful “on average” is actually wrong. Sure, they have the potential for being more powerful, but most of the time, people aren’t willing to pay the premium to reach those extra bits of performance.
Now why is this important? Surely what matters are the people who paid premium prices for premium parts? It matters because of the previous point: PCs don’t have some ubiquitous quality advantage over the consoles. Developers will always have to keep low-end PCs in mind, because not even half of all PC players can afford the good stuff, and you have to look at the top quarter of Steam players before you reach PS4-Pro-level specs. If every Steam player were handed a PS4 Pro, it would be an upgrade for over 60% of them, and an Xbox One X would be an upgrade for 70% of them.
Sure, you could still make the argument that when you pay more for PC parts, you get a better experience than you could with a console. We can argue all day about budget PCs, but a console can’t match up to a $1,000 PC build. It’s the same as paying more for car parts, in the end you get a better car. However, there is a certain problem with that…

“You pay a little more for a PC, you get much more quality.”

The idea here is that the more you pay for PC parts, the performance increases at a faster rate than the price does. Problem: that’s not how technology works. Paying twice as much doesn’t get you twice the quality the majority of the time.
For example, let’s look at graphics cards, specifically the GeForce 10 series cards, starting with the GTX 1050.
  • 1.8 TFLOP
  • 1.35 GHz base clock
  • 2 GB VRAM
  • $110
This is our reference, our basis of comparison. Any percentages will be based on the 1050’s specs.
Now let’s look at the GTX 1050 Ti, the 1050’s older brother.
  • 2.1 TFLOP
  • 1.29 GHz base clock
  • 4 GB VRAM
  • $140 retail
This is pretty good. You only increase the price by about 27%, and you get roughly a 17% increase in floating point speed and a 100% increase (double) in VRAM. Sure, you get a slightly lower base clock, but the rest definitely makes up for it. In fact, according to GPU Boss, the Ti managed 66 fps in Battlefield 4, a 22% increase in frame rate, and a 54% increase in mHash/second in bitcoin mining. The cost increase is worth it, for the most part.
But let’s get to the real meat of it: what happens when we double our budget? Surely we should see a massive increase in performance; I imagine some of you are willing to bet that twice the cost means more than twice the performance.
The closest price comparison for double the cost is the GTX 1060 (3 GB), so let’s get a look at that.
  • 3.0 TFLOP
  • 1.5 GHz base clock
  • 3 GB VRAM
  • $200 retail
Well… not substantial, I’d say. About a 67% increase in floating point speed, an 11% increase in base clock speed, and 1 GB more VRAM. For [almost] doubling the price, you don’t get much on paper.
Well, surely raw specs don’t tell the full story, right? Let’s look at some real-world comparisons. Once again according to GPU Boss, there’s a 138% increase in hashes/second for bitcoin mining, and at 99 fps, an 83% frame rate increase in Battlefield 4. Well then, raw specs do not tell the whole story!
Here’s another one, the 1060’s big brother… or, well, slightly-more-developed twin.
  • 3.9 TFLOP
  • 1.5 GHz base clock
  • 6 GB VRAM
  • $250 retail
Seems reasonable, another $50 for a decent jump in power and double the memory! But, as we’ve learned, we shouldn’t look at the specs for the full story.
I did do a GPU Boss comparison, but for the BF4 frame rate I had to look at Tom’s Hardware (sorry miners, GPU Boss didn’t cover the mHash/sec spec either). What’s the verdict? Well, pretty good, I’d say. With 97 FPS, a 79% increase over the 1050— wait. 97? That seems too low… I mean, the 3GB version got 99.
Well, let’s see what Tech Power Up has to say...
94.3 fps. 74% increase. Huh.
Alright alright, maybe that was just a dud. We can gloss over that I guess. Ok, one more, but let’s go for the big fish: the GTX 1080.
  • 9.0 TFLOP
  • 1.6 GHz base clock
  • 8 GB VRAM
  • $500 retail
That jump in floating point speed definitely has to be something, and 4 times the VRAM? Sure, it’s 5 times the price, but as we saw, raw power doesn’t always tell the full story. GPU Boss returns to give us the rundown: how do these cards compare in the real world?
Well… a 222% (over three-fold) increase in mHash speed, and a 218% increase in FPS for Battlefield 4. That’s right, for 5 times the cost, you get 3 times the performance. Truly, the raw specs don’t tell the full story.
Increase the cost by 27%, and you increase the frame rate in our example game by 22%. Increase the cost by 83%, and you increase the frame rate by 83%. Sounds good, but increase the cost by 129% and you only get a 79% increase in frame rate; increase it by 358% and you only get 218%. That’s not paying “more for much more power,” that’s a steep drop-off after the third-cheapest option.
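The cost-versus-frame-rate percentages can be reproduced from the listed prices and the quoted Battlefield 4 numbers. One assumption of mine: the GTX 1050’s baseline of roughly 54 fps, which is implied by the stated 22% jump to 66 fps; individual results differ by a point or two from the figures above depending on how the source prices and frame rates were rounded.

```python
# Listed price and quoted BF4 frame rate per card.
# The 1050 baseline (~54 fps) is inferred from the 22% -> 66 fps figure;
# the 1080's 172 fps is back-calculated from the quoted 218% increase.
cards = {
    "GTX 1050":       {"price": 110, "fps": 54},
    "GTX 1050 Ti":    {"price": 140, "fps": 66},
    "GTX 1060 (3GB)": {"price": 200, "fps": 99},
    "GTX 1060 (6GB)": {"price": 250, "fps": 97},
    "GTX 1080":       {"price": 500, "fps": 172},
}

def pct_increase(new, base):
    """Percent increase of `new` over `base`, rounded to a whole percent."""
    return round((new - base) / base * 100)

base = cards["GTX 1050"]
for name, c in cards.items():
    cost = pct_increase(c["price"], base["price"])
    perf = pct_increase(c["fps"], base["fps"])
    print(f"{name}: +{cost}% cost, +{perf}% BF4 fps")
```

The drop-off is visible immediately: past the 1060 (3GB), each percentage point of price buys less than a point of frame rate.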
In fact, did you know that you have to get to the 1060 (6GB) before you could compare the GTX line to a PS4 Pro? Not to mention that at $250, the price of a 1060 (6GB) you could get an entire PS4 Slim bundle, or that you have to get to the 1070 before you beat the Xbox One X.
On another note, let’s look at a PS4 Slim…
  • 1.84 TFLOP
  • 800 MHz base clock
  • 8 GB VRAM
  • $300 retail
…Versus a PS4 Pro.
  • 4.2 TFLOP
  • 911 MHz base clock
  • 8 GB VRAM
  • $400 retail
A 128% increase in floating point speed and a 13% increase in clock speed, for a 33% increase in cost. Unfortunately there is no Battlefield 4 comparison to make, but in BF1 the frame rate is doubled (30 fps to 60) and the textures are taken to 11. For what that looks like, I’ll leave it up to this bloke. Not to mention that you can even get the texture buffs in 4K. Just like how you get a decent increase in performance per dollar with the lower-cost GPUs, the same applies here.
It’s even worse when you look at the CPU for a gaming PC. The more money you spend, again, the less benefit you get per dollar. Hardware Unboxed covers this in a video comparing different levels of Intel CPUs. One thing to note is that the highest i7 option (6700K) in that video was almost always within 10 FPS (15 FPS for a few games) of a certain other CPU on the list.
…That CPU was the lowest i3 (6100) option. The lowest i3 was $117 and the highest i7 was $339, a 189% price difference for what was, on average, a 30% or less difference in frame rate. Even the lowest Pentium option (G4400, $63) was often able to keep up with the i7.
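The same per-dollar framing applies to those CPU numbers. The prices are the ones listed above; the average frame rates below are my illustrative placeholders for “the i7 was within roughly 30% of the i3,” not measured values:

```python
# Listed prices (USD).
i3_price, i7_price = 117, 339
price_diff_pct = (i7_price - i3_price) / i3_price * 100  # ~190%

# Placeholder averages consistent with a ~30% frame-rate gap (not measured).
i3_fps, i7_fps = 100, 130

# Frames per dollar: the cheaper chip wins by a wide margin.
i3_value = i3_fps / i3_price   # ~0.85 fps per dollar
i7_value = i7_fps / i7_price   # ~0.38 fps per dollar
print(round(price_diff_pct), round(i3_value, 2), round(i7_value, 2))
```

Nearly triple the price for under a third better frames-per-dollar is the inverse of “pay more, get much more.”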
The CPU and GPU are usually the most expensive and power-consuming parts of a build, which is why I focused on them (and because, alongside RAM, they’re the most important parts of a gaming PC). With both, this “pay more to get much more performance” idea is pretty much the inverse of the truth.

“The console giants are bad for game developers; Steam doesn’t treat developers as badly as Microsoft or especially Sony.”

Now, one thing you might’ve heard is that the PS3 was incredibly difficult to make games for, which for some fueled the idea that console hardware is difficult to develop for compared to PC… but this ignores a very basic idea that we’ve already touched on: if devs don’t want to make a game compatible with a system, they don’t have to. In fact, this is why Left 4 Dead and other Valve games aren’t on PS3: Valve didn’t want to work with its hardware, calling it “too complex.” This didn’t stop the game from selling well over 10 million units worldwide. If anything, this was a problem for the PS3, not the dev team.
This also ignores that games like LittleBigPlanet, Grand Theft Auto IV, and Metal Gear Solid 4 all came out in the same year as Left 4 Dead (2008) on PS3. Apparently, plenty of other dev teams didn’t have much of a problem with the PS3’s hardware, or at the very least, they got used to it soon enough.
On top of that, when developing the 8th gen consoles, both Sony and Microsoft sought CPUs that were easier for developers to work with, which included decisions that considered the consoles’ usage for more than gaming. Their single-chip proprietary CPUs are also cheaper and more energy efficient than pre-made CPUs and boards, which is a far better reason for using them than some conspiracy about Sony and MS trying to make devs’ lives harder.
Now, console exclusives are apparently a point of contention: it’s often said that exclusivity can cause developers to go bankrupt. However, exclusivity doesn’t have to be a bad thing for the developer. For example, when Media Molecule had to pitch their game to a publisher (Sony, coincidentally), they didn’t end up tied into something detrimental to them.
Their initial funding lasted for 6 months. From then, Sony offered additional funding, in exchange for Console Exclusivity. This may sound concerning to some, but the game ended up going on to sell almost 6 million units worldwide and launched Media Molecule into the gaming limelight. Sony later bought the development studio, but 1: this was in 2010, two years after LittleBigPlanet’s release, and 2: Media Molecule seem pretty happy about it to this day. If anything, signing up with Sony was one of the best things they could’ve done, in their opinion.
Does this sound like a company that has it out for developers? There are plenty of examples that people will use to put Valve in a good light, but even Sony is comparatively good to developers.

“There are more PC gamers.”

The total number of active PC gamers on Steam has surpassed 120 million, which is impressive, especially considering that this number is double 2013’s figure (65 million). But the number of monthly active users on Xbox Live and PSN? About 120 million (1, 2) total. EDIT: You could argue that this isn’t an apples-to-apples comparison, sure, so if you want to compare the monthly number of Steam users to consoles? Steam has about half of what the consoles do, at 67 million.
Now, back to the 65 million total-user figure for Steam: the best reference I could find for PlayStation’s number was an article giving the number of registered PSN accounts in 2013, 150 million. In a similar 4-year period (2009 - 2013), the number of registered PSN accounts didn’t double, it sextupled, increasing sixfold. Considering the PS4 is already at 2/3 of the PS3’s sales despite being 3 years younger than its predecessor was, I’m sure this trend is at least generally consistent.
For example, let’s look at DOOM 2016, an awesome fast-paced shooting title with graphics galore… Of course, as a single platform, it sold best on PC/Steam: 2.36 million Steam sales, 2.05 million PS4 sales, 1.01 million Xbox One sales.
But keep in mind… when you add the console sales together, you get over 3 million sales on the 8th gen systems. Meaning: this game sold best on console. In fact, the Steam sales have only recently surpassed the PS4 sales. By the way, VGChartz only shows sales of physical copies, so the PS4 and Xbox totals, once digital sales are included, are even higher than 3 million.
This isn’t uncommon, by the way.
Even with the games where the PC sales are higher than either console’s, there are generally more console sales in total. But, to be fair, this isn’t anything new. PC gamers have never dominated the market; the percentages have always been about this much. PC can end up being the largest single platform for a game, but consoles usually sell more copies in total.
EDIT: There were other examples but... Reddit has a 40,000-character limit.

"Modding is only on PC."

Xbox One is already working on it, and Bethesda is helping with that.
PS4 isn't far behind either. You could argue that these are what would be the beta stages of modding, but that just means modding on consoles will only grow.

What’s the Point?

This isn’t to say that there’s anything wrong with PC gaming, and this isn’t to exalt consoles. I’m not here to be the hipster defending the little guy, nor to put someone or something down out of spite. This is about showing that PCs and consoles are overall pretty similar, that there isn’t much dividing them, and that there isn’t anything wrong with being a console gamer. There’s no chasm separating consoles and PCs; at the end of the day, they’re both computers that are (generally) designed for gaming. This is about unity as gamers, showing that there shouldn’t be a massive divide just because of the computer system you game on. I want gamers to be in an environment where specs don’t separate us; whether you got a $250 PS4 Slim or just built a $2,500 gaming PC, we’re here to game, and should be able to have healthy interactions regardless of platform.
I’m well aware that this isn’t going to fix… much, but this needs to be said: there isn’t a huge divide between the PC and consoles, they’re far more similar than people think. There are upsides and downsides that one has that the other doesn’t on both sides. There’s so much more I could touch on, like how you could use SSDs or 3.5 inch hard drives with both, or that even though PC part prices go down over time, so do consoles, but I just wanted to touch on the main points people try to use to needlessly separate the two kinds of systems (looking at you PCMR) and correct them, to get the point across.
I thank anyone who takes the time to read all of this, and especially anyone who doesn’t take what I say out of context. I also want to note that, again, this isn’t “anti-PC gamer.” If it were up to me, everyone would be a hybrid gamer.
Cheers.
submitted by WhyyyCantWeBeFriends to unpopularopinion

AMD's Growing CPU Advantage Over Intel

https://seekingalpha.com/article/4152240-amds-growing-cpu-advantage-intel?page=1
Mar. 1, 2018 | About: Advanced Micro Devices (AMD)
Raymond Caron, Ph.D. (Tech, solar, natural resources, energy; 315 followers)

Summary: AMD's past and economic hazards. AMD's current market conditions. AMD's Zen CPU advantage over Intel.

AMD is primarily a CPU fabrication company with much experience and a great history in that respect. They hold patents for 64-bit processing, as well as ARM-based processing patents and GPU architecture patents. AMD built a name for itself in the mid-to-late '90s when they introduced the K-series CPUs to good reviews, followed by the Athlon series in '99. AMD was profitable; they bought the companies NexGen, Alchemy Semiconductor, and ATI.

Past Economic Hazards

If AMD has such a great history, then what happened? Before I go over the technical advantage AMD has over Intel, it's worth looking at how AMD failed in the past, and whether those hazards still present a risk, since for investment purposes we're most interested in AMD turning a profit. AMD suffered from intermittent CPU fabrication problems, and was also the victim of sustained anti-competitive behaviour from Intel, which interfered with AMD's attempts to sell its CPUs to the market through Sony, Hitachi, Toshiba, Fujitsu, NEC, Dell, Gateway, HP, Acer, and Lenovo. Intel was investigated and/or fined by multiple countries including Japan, Korea, the USA, and the EU. These hazards need to be examined to see if history will repeat itself. There have been some rather large changes in the market since then.
1) The EU has shown they are not averse to leveling large fines, and Intel is still fighting the guilty verdict from the last EU fine levied against them; they’ve already lost one appeal. It’s conceivable that the EU, and other countries, would prosecute Intel again. This is compounded by the recent security problems with Intel CPUs and the fact that Intel falsely advertised these CPUs as secure when it knew they were not. Here are some of the largest fines dished out by the EU.
2) The Internet has evolved from Web 1.0 to 2.0. Consumers are increasing their online presence each year. This reduces the clout that Intel can wield over the market as AMD can more easily sell to consumers through smaller Internet based companies.
3) Traditional distributors (HP, Dell, Lenovo, etc.) are struggling. All of these companies have had recent issues with declining revenue due to Internet competition and ARM competition. This reduces the clout that Intel has over them, as Intel is no longer able to ensure their future; it no longer pays to be in the club. These points are summarized in the graph below, from Statista, which shows “ODM Direct” sales and “other sales” increasing their market share from 2009 to Q3 2017.
4) AMD spun off Global Foundries as a separate company. AMD has a fabrication agreement with Global Foundries, but is also free to fabricate at another foundry such as TSMC, where AMD has recently announced they will be printing Vega at 7nm.
5) Global Foundries developed the capability to fabricate at 16nm, 14nm, and 12nm alongside Samsung, and IBM, and bought the process from IBM to fabricate at 7nm. These three companies have been cooperating to develop new fabrication nodes.
6) The computer market has grown much larger since the mid-90s to 2006 period, when AMD last had a significant tangible advantage over Intel. Computer sales rose steadily until 2011 before starting a slow decline, see the Statista graph below. The decline corresponds directly to the loss of competition between AMD and Intel after AMD released the Bulldozer CPU in 2011; tablets, available starting in 2010, also contributed to the decline that began in 2012. It’s important to note that computer shipments did not fall in 2017 — they remained static — and AMD’s GPU market share rose in Q4 2017 at the expense of Nvidia and Intel.
7) In terms of fabrication, AMD has access to 7nm through Global Foundries as well as through TSMC. It’s unlikely that AMD will experience CPU fabrication problems in the future. This is something of a reversal of fortunes, as Intel is now experiencing issues with its 10nm fabrication facilities, which are behind schedule by more than two years, and maybe longer. It would be costly for Intel to use another foundry to print its CPUs due to the overhead its current foundries place on its bottom line. If Intel is unable to get the 10nm process working, they’re going to have difficulty competing with AMD.
AMD: Current market conditions
In 2011 AMD released its Bulldozer line of CPUs to poor reviews and was relegated to the discount market, where sales margins are low. Since that time AMD’s profits have been largely determined by the performance of its GPU and Semi-Custom business. Analysts have become accustomed to looking at AMD’s revenue from a GPU perspective, which isn’t currently seen in a positive light due to the relation between AMD GPUs and cryptocurrency mining.
The market views cryptocurrency as further risk to AMD. When Bitcoin was introduced it was also mined with GPUs. When mining switched to ASICs (simple, inexpensive circuits) for increased profitability, the GPUs purchased for mining were resold on the market and ended up competing with, and hurting, new AMD GPU sales. There is also perceived risk to AMD from Nvidia, whose Pascal GPU offerings have favorable reviews. While AMD has been selling GPUs, they haven’t increased GPU supply in response to cryptocurrency demand, while Nvidia has. This resulted in very high prices for AMD GPUs relative to Nvidia’s. There are strategic reasons for AMD’s current position:
1) While AMD GPUs are profitable and greatly desired for cryptocurrency mining, AMD’s market access is through third-party resellers, who enjoy the revenue from marked-up GPU sales. AMD most likely makes lower margins on GPU sales relative to Zen CPU sales due to the higher fabrication costs of larger dies and the corresponding lower yield. For reference I’ve included the sizes of AMD’s and Nvidia’s GPUs as well as AMD’s Ryzen CPU and Intel’s Coffee Lake 8th-generation CPU. This suggests that if AMD had to pick and choose between products, they’d focus on Zen due to higher yield, higher revenue per sale, and an increase in margin.
2) If AMD maintained historical levels of GPU production in the face of cryptocurrency demand, while increasing production of Zen products, they would maximize potential income from the highest-margin products (EPYC) while reducing future vulnerability to second-hand GPUs being resold on the market.
3) AMD was burned in the past by second-hand GPUs and wants to avoid repeating that experience. AMD has stated several times that the cryptocurrency boom was not factored into forward-looking statements, meaning they haven’t produced more GPUs in expectation of more GPU sales.
In contrast, Nvidia increased its production of GPUs due to cryptocurrency demand, as AMD did in the past. Since its Pascal GPU has entered its second year on the market and is capable of running video games for years to come (1080p and 4k gaming), Nvidia will soon be competing directly with older mining GPUs that are as capable as the cards it currently sells. Second-hand mining GPUs are known to function very well, often needing only a fan replacement: semiconductors work best in a steady state, as opposed to being turned on and off, so a card endures less wear when used 24/7.
The market is also pessimistic regarding AMD’s P/E ratio. The market is accustomed to evaluating stocks using the P/E ratio, but this metric is not accurate for evaluating new companies, or companies going into or coming out of bankruptcy. It is more accurate for evaluating companies with a consistent operating trend over time.
“Similarly, a company with very low earnings now may command a very high P/E ratio even though it isn’t necessarily overvalued. The company may have just IPO’d and growth expectations are very high, or expectations remain high since the company dominates the technology in its space.” P/E Ratio: Problems With The P/E
I regard the pessimism surrounding AMD stock due to GPUs and past history as a positive trait, because the threat is minor. While AMD is experiencing competitive problems with its GPUs in gaming, AMD holds an advantage in blockchain processing, which stands to be a larger and more lucrative market. I also believe that AMD’s progress with Zen, particularly with EPYC, and the recent Meltdown-related security and performance issues with all Intel CPU offerings, far outweigh any GPU turbulence. This turns the pessimism surrounding AMD’s GPUs into a stock benefit:
1) A pessimistic group prevents the stock from becoming a bubble. It provides a counterargument against hype from product launches that are not yet proven by earnings — unfortunately a historical trend for AMD, which has had difficulty selling server and consumer CPUs in the past due to market interference by Intel.
2) It creates predictable daily, weekly, monthly, and quarterly fluctuations in the stock price that can be used to generate income.
3) Due to recent product launches and market conditions (the Zen architecture advantage, the 12nm node launching, the Meltdown performance flaw affecting all Intel CPUs, Intel’s problems with 10nm) and the fact that AMD is once again selling a competitive product, AMD is making more money each quarter. Therefore the base price of AMD’s stock will rise with earnings, as we’re seeing. This is also a form of investment security, where perceived losses are recovered over time, because the stock is on a long-term upward trajectory driven by new products reaching a responsive market.
4) AMD remains a cheap stock. While it’s volatile, it’s in a long-term upward trend due to market conditions and new product launches, so an investor with a limited budget can buy more stock to maximize earnings. This advantage also means that the stock is more easily manipulated, as seen during the Q3 2017 ER.
5) The pessimism is unfounded. The cryptocurrency craze hasn’t died; it increased, fell, and recovered. The second-hand market did not see an influx of mining GPUs, as mining remains profitable.
6) Blockchain is an emerging market that will eclipse the gaming market in size due to the wide breadth of applications across various industries. Vega is a highly desired product for blockchain applications, as AMD has retained a processing and performance advantage over Nvidia. There are more, and rapidly growing, applications for blockchain every day, all (or most) of which will require GPUs — for instance Microsoft, the Golem supercomputer, IBM, HP, Oracle, Red Hat, and others.
Long-term upward trend
AMD is at the beginning of a long-term upward trend supported by a comprehensive and competitive product portfolio that is still being delivered to the market; AMD refers to this as product ramping. AMD’s most effective Zen products are EPYC and the Raven Ridge APU. EPYC entered the market in mid-December and was completely sold out by mid-January, but has since been restocked. Intel remains uncompetitive in that industry, as its CPU offerings are hampered by a 40% performance flaw due to Meltdown patches. Server CPU sales command the highest margins for both Intel and AMD.
The AMD Raven Ridge APU was recently released to excellent reviews. The APU is significant because of high GPU prices driven by cryptocurrency, and because the APU is a CPU/GPU hybrid with the performance to play today’s games at 1080p. The APU also supports the Vulkan API, which can call upon multiple GPUs to increase performance, so a system can later be upgraded with an AMD or Nvidia GPU that supports Vulkan for increased performance in games or workloads programmed to support it. Or the APU can be replaced when GPU prices fall.
AMD also stands to benefit as Intel confirmed that its new 10nm fabrication node is behind in technical capability relative to the Samsung, TSMC, and Global Foundries 7nm fabrication process. This brings into question Intel’s competitiveness in 2019 and beyond.
Take-Away
• AMD was uncompetitive with respect to CPUs from 2011 to 2017.
• When AMD was competitive, from 1996 to 2011, it recorded profits and bought three companies, including ATI.
• AMD’s CPU business suffered from:
• Market manipulation by Intel — Intel was fined by the EU, Japan, and Korea, and settled with the USA.
• Foundry productivity and upgrade complications.
• AMD has changed:
• Global Foundries was spun off as an independent business.
• It has developed 14nm and 12nm, and is implementing 7nm fabrication.
• Intel is late on 10nm, which is less competitive than the 7nm node.
• AMD can fabricate products at multiple foundries (TSMC, Global Foundries).
• The market has changed:
• More AMD products are available on the Internet, and both Internet adoption and the size of the Internet retail market have exploded, thanks to the success of smartphones and tablets.
• Consumer habits have changed; more people shop online each year. Traditional retailers have lost market share.
• The computer market is larger on average but has been declining. While computer shipments declined in Q2 and Q3 2017, AMD sold more CPUs.
• Analysts look to GPU and Semi-Custom sales for revenue.
• The cryptocurrency boom intensified; no crash occurred.
• AMD did not increase GPU production to meet cryptocurrency demand.
• Blockchain represents new growth potential for AMD GPUs.
• Pessimism acts as security against a stock bubble and corresponding bust, and creates cyclical volatility in the stock that can be used to generate profit.
• The P/E ratio is misleading when used to evaluate AMD.
• AMD has long-term growth potential.
• In 2017 AMD released a competitive product portfolio.
• Since Zen was released in March 2017, AMD has beaten ER expectations.
• AMD returned to profitability in 2017.
• AMD is taking measurable market share from Intel in OEM desktop and the overall CPU market.
• High-margin server product EPYC was released in December 2017, just before the worst-ever CPU security bug was found in Intel CPUs, which are hit with a detrimental 40% performance patch.
• The Ryzen APU (Raven Ridge) was announced in February 2018 to meet the gaming GPU shortage created by high cryptocurrency-mining demand.
• Blockchain is a long-term growth opportunity for AMD.
• Intel is behind the competition for the next CPU fabrication node.
AMD’s growing CPU advantage over Intel
About AMD’s Zen
Zen is a technical breakthrough in CPU architecture because it’s a modular design and because it is a small CPU, while providing similar or better performance than the Intel competition.
Since Zen was released in March 2017, we’ve seen AMD go from 18% CPU market share in OEM consumer desktops to essentially 50% — supported by comments from Lisa Su during the Q3 2017 ER call, by MindFactory.de, and by Amazon CPU sales. We also saw AMD increase its share of the total desktop CPU market, and market share has begun to shift between AMD and Intel as new CPUs are released. Zen’s breakthrough rests on a few general principles of electronics, which give AMD an across-the-board advantage over Intel in every CPU market it addresses.
1) The larger the CPU, the lower the yield. The Zen die that makes up Ryzen, Threadripper, and EPYC is smaller (44 mm² compared to 151 mm² for Coffee Lake). A larger CPU means fewer CPUs made per wafer during fabrication: AMD will get roughly 3x the fabrication yield per wafer for each Zen die compared to each Coffee Lake die, so each CPU has a much lower manufacturing cost.
2) The larger the CPU, the harder it is to fabricate without errors. The chance that a CPU is fabricated perfectly falls exponentially with increasing surface area, so Intel will have fewer flawless CPUs per wafer than AMD. This means that AMD will make a higher margin on each CPU sold. AMD’s supply of perfectly printed Ryzens (1800X) was so high that the company had to sell them at a reduced price in order to meet demand for the cheaper Ryzen 5 1600X. If you bought a 1600X in August/September, you probably ended up with an 1800X.
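To make the yield compounding concrete, here is a rough sketch using a standard Poisson defect model. The defect density is an invented illustrative figure; only the die areas (44 mm² and 151 mm²) come from the article, and real foundry yields depend on many more factors.

```python
import math

def die_yield(area_mm2: float, defects_per_mm2: float = 0.002) -> float:
    """Poisson model: probability a die of the given area has zero defects."""
    return math.exp(-defects_per_mm2 * area_mm2)

# Die areas quoted in the article; defect density is made up for illustration.
for name, area in [("Zen die, 44 mm^2", 44), ("Coffee Lake, 151 mm^2", 151)]:
    print(f"{name}: {die_yield(area):.1%} defect-free")
```

The exponential term is why the gap widens as dies grow: doubling the area squares the probability of losing the die, rather than merely doubling it.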
3) Larger CPUs are harder to fabricate without errors on smaller nodes. The ability to fabricate CPUs at smaller nodes becomes more difficult due to the higher precision required and the corresponding increase in errors.
“A second reason for the slowdown is that it’s simply getting harder to design, inspect and test chips at advanced nodes. Physical effects such as heat, electrostatic discharge and electromagnetic interference are more pronounced at 7nm than at 28nm. It also takes more power to drive signals through skinny wires, and circuits are more sensitive to test and inspection, as well as to thermal migration across a chip. All of that needs to be accounted for and simulated using multi-physics simulation, emulation and prototyping.” Is 7nm The Last Major Node?
“Simply put, the first generation of 10nm requires small processors to ensure high yields. Intel seems to be putting the smaller die sizes (i.e. anything under 15W for a laptop) into the 10nm Cannon Lake bucket, while the larger 35W+ chips will be on 14++ Coffee Lake, a tried and tested sub-node for larger CPUs. While the desktop sits on 14++ for a bit longer, it gives time for Intel to further develop their 10nm fabrication abilities, leading to their 10+ process for larger chips by working their other large chip segments (FPGA, MIC) first.”
There are plenty of steps where errors can be introduced into a fabricated CPU. This is most likely the culprit behind Intel’s inability to launch its 10nm fabrication process: they are simply unable to print such a large CPU on such a small node with high enough yields to make the process competitive. Intel thought they were ahead of the competition with respect to printing large CPUs on a small node, until AMD avoided the issue completely by designing a smaller, modular CPU. Intel avoided any mention of its 10nm node during its Q4 2017 ER, which I interpret as bad news for Intel shareholders.
If you have nothing good to say, you say nothing — and Intel having nothing to say about something fundamentally critical to its success as a company can’t be good. Intel is on track, however, to deliver hybrid CPUs where some small components are printed on 10nm. It has also recently come to light that Intel’s 10nm node is less competitive than the Global Foundries, Samsung, and TSMC 7nm nodes, which means Intel is now firmly behind in CPU fabrication.
4) AMD Zen is a new architecture built from the ground up. Intel’s CPUs are built on top of older architecture developed with 30-year-old strategies, some of which we’ve recently discovered are flawed. This resulted in the Meltdown flaw and the Spectre flaws, and also includes the ME and AMT bugs in Intel CPUs. While AMD is still affected by Spectre, AMD has only ever acknowledged being fully susceptible to Spectre 1, as AMD considers Spectre 2 difficult to exploit on an AMD Zen CPU.
“It is much more difficult on all AMD CPUs, because BTB entries are not aliased - the attacker must know (and be able to execute arbitrary code at) the exact address of the targeted branch instruction.” Technical Analysis of Spectre & Meltdown * Amd
Further reading: Spectre and Meltdown: Linux creator Linus Torvalds criticises Intel's 'garbage' patches | ZDNet; FYI: Processor bugs are everywhere - just ask Intel and AMD; Meltdown and Spectre: Good news for AMD users, (more) bad news for Intel; Cybersecurity agency: The only sure defense against huge chip flaw is a new chip; Kernel-memory-leaking Intel processor design flaw forces Linux, Windows redesign
Take-Away
• AMD Zen enjoys a CPU fabrication yield advantage over Intel.
• AMD Zen enjoys a higher yield of high-quality CPUs.
• Intel’s CPUs suffer a 40% performance drop due to the Meltdown flaw, which affects server CPU sales.
AMD stock drivers
1) EPYC
• A critically acclaimed CPU sold at a discount compared to Intel.
• Not affected by the 40% software slowdowns due to Meltdown.
2) Raven Ridge desktop APU
• Targets the unfed GPU market, which has been stifled by cryptocurrency demand.
• Customers can upgrade to a new CPU or add a GPU later without changing the motherboard.
• AM4 motherboard supported until 2020.
3) Vega GPU sales to Intel for 8th-generation CPUs with integrated graphics
• AMD gains access to the complete desktop and mobile market through Intel.
4) Mobile Ryzen APU sales
• Providing gaming capability in a compact power envelope.
5) Ryzen and Threadripper sales
• Fabricated on 12nm in April.
• May eliminate Intel’s last remaining CPU advantage in single-core IPC.
• AM4 motherboard supported until 2020.
• 7nm Ryzen on track for early 2019.
6) Others: Vega, Polaris, Semi-Custom, etc.
• I consider any positive developments here to be gravy.
Conclusion
While in the past Intel interfered with AMD's ability to bring its products to market, the market has changed. The Internet has grown significantly and is now a large market that dominates computer sales. It's questionable whether Intel still has the influence to affect this new market, and doing so would almost certainly result in fines and further bad press.
AMD's foundry problems were turned into an advantage over Intel.
AMD's more recent past was heavily influenced by the failure of the Bulldozer line of CPU's that dragged on AMD's bottom line from 2011 to 2017.
AMD's Zen line of CPUs is a breakthrough that exploits an alternative, superior strategy in chip design, resulting in a smaller CPU. A smaller CPU enjoys compounded yield and quality advantages over Intel's CPU architecture. Intel's lead in CPU performance will at the very least be challenged, and will more likely come to an end in 2018, until they release a redesigned CPU.
I previously targeted AMD to be worth $20 by the Q4 2017 ER, based on the speed at which Intel gets products to market; AMD is much slower by comparison. I believe the stock should be there already, but the GPU story dominated due to the cryptocurrency craze. Financial analysts need more time to catch on to what’s happening with AMD — they need an ER that is driven by CPU sales, and I believe Q1 2018 is the ER to do that. AMD had EPYC stock in stores when the Meltdown and Spectre flaws hit the news; these CPUs, which carry large margins, were sold out by mid-January.
There are many variables at play within the market, but barring any disruptions I’d expect AMD to be worth $20 at some point in 2018 due to these market drivers. If AMD sold enough EPYC CPUs due to Intel’s ongoing CPU security problems, it may happen following the Q1 2018 ER. However, if anything is customary with AMD, it’s that these things always take longer than expected.
submitted by kchia124 to AMD_Stock [link] [comments]

Advice? Going a bit insane.

I have a simple set-up I've been trying to get running stable for about a week now. I have 2x NVIDIA GTX 980 and 2x Radeon R9 280X cards that I want to use.
I have previously used just the GTX 980s and they were stable for weeks. I recently acquired the 280s, and while I can't be certain, I think the cards are not at issue.
I also had two of the 980s and one 280 running in my desktop stable for quite some time, just as a tester. But for the actual mining rig, I was using old AMD AM3-based hardware. And I've had nothing but problems. Murphy's law.
First, I had a motherboard go out (ASUS M4A87TD EVO). Luckily I had a spare. But the ASUS board had two native PCI-E x16 slots (one running at x1 speed), so I had been running the 980s both in the motherboard, and was using this riser extender to run the other cards, powered through SATA-to-molex adapters (I don't have enough native 4-pin connectors to run without adapters).
The "new" board (which is actually chronologically older and older tech) is a Gigabyte GA-MA785GM-US2H. It's also socket AM3 but only supports DDR2 memory; luckily I had old DDR2 available. This board also only has one x16 slot, so I am running 3 GPUs outside of the chassis with x16 risers.
It turns out after some nightmare of troubleshooting that one of the DDR2 DIMMs is probably bad. I don't have a replacement at the moment, but I've tried running everything with one DIMM.
I also have Virtual Memory (swap/page file) set to 16GB. I don't know why this is necessary, but other people mention it and NiceHash wouldn't let me run 4 cards without it.
Anyway, all works "fine;" I have the 280s set to 850MHz core clock and +20 power with 100% fans, and 100% fans and native clock on the 980s.
But I keep getting crashes. Before I had freezes, so I also changed out power supplies several times and tried everything you can think of. I don't think the issue is down to power any more, but I'm open to correction if anyone has a suggestion or tip.
For power, I'm using a 3 to 1 ATX-24 pin adapter, with a SeaSonic 620W as the master, and a Corsair 550W as the "slave." The SeaSonic powers the motherboard and the ATX12V for the CPU. It also powers the 3 SATA to molex adapters for the x16 risers, and the 8+6 pin PCI-E power connectors for the 280s.
The Corsair powers the HDD (this I have also suspected of causing problems but I have swapped HDDs and had the same issues, and I really need to get an SSD for this thing,) two 120mm intake fans, two 140mm exhaust fans, and the 6 pin PCI-E connectors for the 980s.
It doesn't seem to freeze anymore. It will mine just fine for a while but eventually I will get a BSOD, usually saying that the device driver has gone into an infinite loop. I believe this is the Crimson driver (I am using version 15.12.)
I've swapped CPUs and memory and HDDs and motherboards and I'm just almost at my wits' end about to tear my hair out trying to get this thing to run stably. I most likely just need to get all new mainboard/CPU/memory, but I can't do that at the moment, so I am looking for any advice/help I can get.
I would assume the driver is going into an infinite loop because it's polling hardware and not getting the response it expects or something like that, which could indicate a power or heat issue, but I don't know. And I've searched for the BSOD error and 90% of the results for "device driver went into an infinite loop" (paraphrasing) were related to AMD Crimson drivers. So that's been where my suspicion has been so far, and I've had other problems with getting the AMD stuff to work when I was initially setting it up. (And had problems with AMD/ATi drivers going back 20 years, ugh.)
So.. Any advice?
TL;DR: Running 2x GTX 980 and 2x R9 280X; keep getting BSOD about device driver going into an infinite loop after some time mining. Want to stab myself.
submitted by mutilatedrabbit to EtherMining [link] [comments]

The Concept of Bitcoin

What is Bitcoin?
Bitcoin is an experimental system for transferring and verifying ownership, based on a peer-to-peer network with no central authority.
The initial application, and the main innovation, of the Bitcoin network is a decentralized digital currency system whose unit of account is the bitcoin.
Bitcoin works with software and a protocol that allow participants to issue bitcoins and manage transactions collectively and automatically. As a free, open-source protocol, it also allows interoperability between the software and services that use it. As a currency, bitcoin is both a medium of payment and a store of value.
Bitcoin is designed to self-regulate. The system's limited inflation is distributed evenly across the network's computing power, and the supply is limited to 21 million units, each divisible to the eighth decimal place. Operation is secured by an open design that anyone can examine, because everything is public: the basic protocols, the cryptographic algorithms, the programs implementing them, the account data, and the developers' discussions.
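The fixed cap and eighth-decimal divisibility pin down the total number of indivisible units (commonly called satoshis); a quick sanity check:

```python
MAX_BTC = 21_000_000            # hard supply cap
SATOSHIS_PER_BTC = 10 ** 8      # divisible to the eighth decimal place

total_units = MAX_BTC * SATOSHIS_PER_BTC
print(f"{total_units:,}")       # 2,100,000,000,000,000 indivisible units
```

About 2.1 quadrillion units in total, which comfortably fits in a 64-bit integer — one reason amounts in Bitcoin software are handled as integer satoshis rather than floating-point bitcoins.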
Possession of bitcoins is represented by a sequence of numbers and letters making up a virtual key that allows spending the bitcoins associated with it in the ledger. A person may hold several keys collected in a "Bitcoin wallet" — a web, software, or hardware keychain — that gives access to the network in order to make transactions. The wallet holds keys for checking balances and public keys for receiving payments; it also contains (usually encrypted) the private key associated with each public key. Private keys must remain secret, because anyone who holds them can spend the bitcoins associated with them in the ledger. Any medium that can hold the sequence of symbols can store a keychain: paper, a USB drive, a memory stick, etc. With appropriate software, you can manage your assets on your computer or your phone.
Bitcoins can be obtained in several ways: from a holder who transfers them to you, for example in exchange for goods; through an exchange platform that converts conventional currencies into bitcoins; or by participating in the collective verification of transactions (mining).
The Bitcoin source code has been released under the open-source MIT license, which allows anyone to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the software, provided a copyright notice is included in all copies.
Bitcoin creator, Satoshi Nakamoto
What is the Mining of bitcoin?
Technical details :
During mining, your computer repeatedly computes a cryptographic hash (two successive SHA-256 passes) of what is called a block header. For each new hash, the mining software uses a different number called a nonce. A block is valid when its header hash falls below the current target, which is usually expressed as the mining difficulty. The difficulty is calculated by comparing how hard it is to generate a block now with how hard it was to generate the very first block. A difficulty of 70,000 therefore means 70,000 times the effort it took Satoshi Nakamoto to generate the first block, back when mining was much slower and poorly optimized.
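A minimal sketch of that loop — double SHA-256 over a header plus an incrementing nonce, with a deliberately easy target so it finishes instantly. Note this toy ignores the real 80-byte block header layout; it only illustrates the hash-and-compare principle.

```python
import hashlib

def double_sha256(data: bytes) -> bytes:
    """Two successive SHA-256 passes, as used on Bitcoin block headers."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def toy_mine(header: bytes, target: int) -> int:
    """Increment the nonce until the header's hash falls below the target."""
    nonce = 0
    while True:
        digest = double_sha256(header + nonce.to_bytes(4, "little"))
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

# Very easy target (1-in-16 odds per try); real targets are astronomically lower.
nonce = toy_mine(b"toy header", 2 ** 252)
print(f"found nonce {nonce}")
```

Lowering the target makes valid nonces rarer, which is exactly how raising the difficulty slows block production.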
The difficulty changes every 2016 blocks. The network adjusts the difficulty so that, at the current global computing power, generating 2016 blocks takes exactly 14 days. That's why the difficulty increases along with the power of the network.
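The retarget rule can be sketched as follows. Bitcoin additionally clamps the adjustment to a factor of four in either direction, which the sketch includes; the function name and numbers are illustrative, not taken from any real implementation.

```python
TWO_WEEKS = 14 * 24 * 60 * 60   # target time for 2016 blocks, in seconds
BLOCKS_PER_PERIOD = 2016

def next_difficulty(current_difficulty: float, actual_seconds: int) -> float:
    """Scale difficulty so the next 2016 blocks take ~14 days.

    The measured duration is clamped to [TWO_WEEKS/4, 4*TWO_WEEKS] so a
    single retarget can change difficulty by at most a factor of four.
    """
    clamped = max(min(actual_seconds, 4 * TWO_WEEKS), TWO_WEEKS // 4)
    return current_difficulty * (TWO_WEEKS / clamped)

# If the last 2016 blocks took only 7 days, difficulty doubles:
print(next_difficulty(70_000, 7 * 24 * 60 * 60))  # 140000.0
```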
Material :
In the beginning, mining with a processor (CPU) was the only way to mine bitcoins. Graphics cards (GPUs) eventually replaced the CPU because their architecture allowed a 50x to 100x increase in computing power while using less electricity per megahash than a CPU.
Although any modern GPU can be used for mining, AMD's GPU architecture proved far superior to Nvidia's for mining bitcoins, and the ATI Radeon HD 5870 card was the most economical for a time.
For a more complete list of graphics cards and their performance, see Wiki Bitcoin: comparison of mining equipment
Just as mining migrated from CPU to GPU, it then evolved to use Field Programmable Gate Arrays (FPGAs) as a mining platform. Although FPGAs did not offer the 50x to 100x speed increase of the CPU-to-GPU transition, they offered much better energy efficiency.
A typical 600 MH/s graphics card consumes about 400W of power, while a typical FPGA device can offer a hash rate of 826 MH/s at 80W of power consumption — roughly five times more computation for the same power. Since energy efficiency is a key factor in mining profitability, this made the GPU-to-FPGA migration an important step for many people.
The world of bitcoin mining is now migrating to Application-Specific Integrated Circuits (ASICs). An ASIC is a chip designed to accomplish a single task. Unlike an FPGA, an ASIC cannot be reprogrammed for other tasks: an ASIC designed to mine bitcoins cannot and will not do anything other than mine bitcoins.
The rigidity of an ASIC makes it possible to offer a 100x increase in computing power while reducing power consumption compared to all other technologies. For example, a typical device offers 60 GH/s (1 GH/s = 1000 MH/s) while consuming 60W of electricity. Compared to a GPU, that is a 100x increase in computing power and a reduction in power consumption by a factor of seven.
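Using the throughput and power figures quoted in this section, hashes per watt makes the size of each generational jump concrete (these are the text's round numbers; real hardware varies widely):

```python
# (MH/s, watts) — figures taken from the text above
devices = {
    "GPU":  (600, 400),
    "FPGA": (826, 80),
    "ASIC": (60_000, 60),   # 60 GH/s = 60,000 MH/s
}

for name, (mhs, watts) in devices.items():
    print(f"{name:5s} {mhs / watts:8.2f} MH/s per watt")
```

The GPU works out to 1.5 MH/s per watt, the FPGA to about 10, and the ASIC to 1000 — which is why, once ASICs arrived, efficiency rather than raw speed became the deciding purchase criterion.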
Unlike the generations of technologies that have preceded the ASIC, ASIC is the "end of the line" when we talk about important technology change. The CPUs have been replaced by the GPUs, themselves replaced by FPGAs that were replaced by ASICs.
There is nothing that can replace ASICs now or in the immediate future. There will be technological refinements in ASIC products, and improvements in energy efficiency, but nothing that can match the 50x to 100x increase in computing power, or the 7x reduction in power consumption, of the previous transitions.
Which means that the energy efficiency of an ASIC device is the only important factor of all product ASIC, since the estimated lifetime of an ASIC device is superior to the entire history of the mining of bitcoin. It is conceivable that a purchased ASIC device today is still in operation in two years if the unit still offers a profitable enough economic to keep power consumption. The profitability of mining is also determined by the value of bitcoin but in all cases, more a device has a good energy efficiency, it is profitable.
Software:
There are two ways to mine: by yourself, or as part of a team (a pool). If you mine solo, you must install the Bitcoin software and configure it for JSON-RPC (see: running Bitcoin). The other option is to join a pool; many are available. In a pool, the reward from any block found by a member is split among all members. The advantage of joining a pool is more frequent and more stable earnings (this is called reducing the variance), but each payout is smaller. In the long run you earn the same amount either way: mining solo yields large but infrequent payouts, while mining with a pool yields small, steady gains.
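The "reduce the variance" point can be illustrated with a tiny simulation. The block probability and reward below are made-up numbers, not real network values: solo and pooled mining have the same expected income, but very different payout patterns.

```python
import random

random.seed(42)  # deterministic for the illustration

BLOCK_REWARD = 25.0      # BTC per block (made-up)
P_BLOCK_PER_DAY = 0.01   # chance our hashrate finds a block on any given day (made-up)
DAYS = 100_000

# Solo mining: all-or-nothing every day.
solo = [BLOCK_REWARD if random.random() < P_BLOCK_PER_DAY else 0.0
        for _ in range(DAYS)]

# Idealised pool: our expected share arrives as a steady trickle (fees ignored).
pool = [BLOCK_REWARD * P_BLOCK_PER_DAY] * DAYS

solo_mean = sum(solo) / DAYS
pool_mean = sum(pool) / DAYS
print(f"solo: {solo_mean:.3f} BTC/day on average, but mostly 0 with rare 25s")
print(f"pool: {pool_mean:.3f} BTC/day, every single day")  # both means ~0.25
```

Both averages converge on the same number; only the day-to-day variance differs, which is exactly the trade-off described above.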
Once you have your software configured, or have joined a pool, the next step is to configure the mining software. The most popular software for ASIC/FPGA/GPU mining is currently CGMiner, or a derivative designed specifically for FPGAs and ASICs, BFGMiner.
If you want a quick taste of mining without installing any software, try Bitcoin Plus, a Bitcoin miner that runs in your browser on your CPU. It is not profitable for serious mining, but it is a good demonstration of how pooled mining works.
submitted by Josephbitcoin to u/Josephbitcoin [link] [comments]

video card options for 4k 60Hz on a budget

I just got a DELL U2718Q 27" 4k display from my job and would like to upgrade my video card from an nVidia GT240 512MB to something that can drive the monitor at 4k. The display is beautiful when driven off my new MacBook Pro. I haven't bought a video card for a hackintosh in forever, and I've read that AMD (ATI to me) is back in the game and that cards have gotten stupid expensive due to crypto mining.
Thanks in advance for any advice provided.
More about what is important to me:
submitted by sharpfork to hackintosh [link] [comments]

First impression of BitCoin poker from a USA player

First some background, I am a US player who hasn't played a lick since Black Friday. I have never used BitCoin before (after reading up on it though I am definitely intrigued), and I didn't want to spend any real money (at least not yet).
So instead of going the route of buying BitCoins, I set up a miner on my ATI 7950 and let it run for ~2 days until I had .15 BTC (~$1.95 US). I then transferred that to my BitCoin wallet. Side note: the BitCoin client app that stores your wallet takes a LONG time to sync. It's downloading a ~3GB file that contains all BitCoin transactions EVER. I have a 50 Mbps connection and it took about 24 hours to catch up, which is terrible, and I sent them a bug report. I was able to mine in the meantime using a separate app that I got from the BTCGuild.com mining pool. So as previously stated, it took about two days to get to the point where I could actually make a deposit.
The card room I picked was SealsWithClubs.eu and they had my transfer almost instantly. That said it took a bit of poking around to make sure I did send the transfer to the right place. It's very obvious that this is a small scale room. That said I was able to quickly sit down at a table and start playing.
Software-wise the room is very basic. I don't currently have HEM or PT installed, but since the room is Flash-based inside your browser, I doubt HUDs will work. There is a way to view the hand history for the table you're at, but I didn't see a way to save hand histories locally, so I don't know that you can even add the data to your program of choice. On top of that, I tried copying and pasting a hand history to weaktight.com and handconverter.com, neither of which could read the hand. So for serious play, right now it's kind of a bust.
That said I did almost quadruple my funds in about 2.5 hours so that's not bad. All in all it did scratch my poker itch. While the software leaves a lot to be desired I think I will keep playing and encourage others to do so with the hope that it will encourage other rooms to start taking US players with BitCoins.
submitted by argash to poker [link] [comments]

Is bitcoin mining still profitable? (new to btc mining)

So I'm new to bitcoin mining. I tried it on my gaming rig and only got around 24 megahashes, which is pretty bad. I want to try bitcoin mining, but I'm jobless right now and don't have much money lying around, so: can you start mining bitcoin with 100€? Electricity here costs around $0.10 per kilowatt-hour. Some people said ATI Radeon cards are good for mining, so I'm thinking about buying one. Should I save up for a dedicated miner instead? Also, should I use my GPU and my CPU at the same time to mine?
submitted by Jonniboy87 to BitcoinMining [link] [comments]

Future of Scrypt - GPU and ASIC

Hey guys, so I've been reading a bunch of threads about ASICs lately, namely the Titan from KNC, and wanted to share my thoughts on the pros and cons. This is a big wall of text that I really wanted to write despite being sleepy, so if I made any silly mistakes I apologize in advance, but overall I think I've made my point. =)

Let me know what you think, and while I know it's t3h interwebz, let's try to keep it constructive! =D

GPU Pros/Cons:
The biggest pro of GPU mining is that almost everyone owns one and can participate in mining in any capacity, meaning that coin minting stays decentralized and not in the hands of the relatively few that purchased a bunch of big ASICs.
Another great thing is that we know the speed at which GPUs improve over time. Neither ATI nor NVidia will come out with a card two quarters from now that is several hundred times faster than the last generation, so we have a kind of stability: we can predict how much next-gen computing power will increase, and last-gen hardware will remain relevant.
Finally, GPUs are primarily made for graphics, and we can safely assume that neither NVidia nor ATI will be 'testing' every one of their next-gen cards for ~6 months, causing difficulty spikes in mining, and then selling them to us after it's no longer profitable to keep them.
Bonus: While not likely at the time being, long term high GPU demand may encourage a new GPU player on the market. ATI and NVidia are great but who's to say it's not time for a threesome? =P I believe I recall reading a while back about intel considering entering the GPU market, maybe they'll reconsider...
So if GPUs are so great, why does anyone bother with ASICs? The single, and rather serious, problem with GPUs is their massive power consumption. While GPUs will eventually improve as the manufacturing process shrinks, it's definitely not happening fast enough.
ASICs pros/cons
Currently the only real pro of ASICs is low power consumption. You could say better hash rate, but so far the only real ASIC we have is the Gridseed, which provides worse hash/$ than a video card but amazing hash/watt.
The first major con (no pun intended =P) is that the rest of the ASICs don't even have so much as a picture of a prototype unit. In the case of the Titan, the claims are just plain extraordinary. 250 MH/s is over 3125 Gridseed chips each running at ~80 kH/s, which is insane considering the claimed price and power consumption of 800-1000 watts. 3125 chips running at 1.5 W each would add up to 4687.5 watts for a single unit, and at $10k each chip would cost you $3.20, while Gridseed chips go for just under $30 apiece. That's one hell of a breakthrough they must have had to boast those stats...
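The chip arithmetic in that paragraph can be reproduced directly; a sketch using only the figures quoted in the post:

```python
# Sanity-checking the KNC Titan claims against Gridseed chip specs (both as
# quoted in the post above; none of these are verified hardware numbers).
titan_hashrate_kh = 250_000   # 250 MH/s expressed in kH/s
gridseed_chip_kh  = 80        # ~80 kH/s per Gridseed chip
chip_watts        = 1.5       # claimed per-chip power draw
titan_price_usd   = 10_000

chips = titan_hashrate_kh / gridseed_chip_kh
print(chips)                    # 3125.0 chips of Gridseed-class silicon
print(chips * chip_watts)       # 4687.5 W if built from such chips
print(titan_price_usd / chips)  # 3.2 dollars per chip implied by the price
```

Against the ~$30 street price per Gridseed chip, the implied $3.20/chip is the author's "one hell of a breakthrough".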
Titan aside, what if the more realistic-sounding ASICs really do ship? Here's the big problem: ASIC manufacturers would eventually have near-total control of the mining hardware if the $/hash is equal to or better than GPUs, since the low power use makes them more profitable. Why is this worse than ATI/AMD and NVidia having control over 'mining hardware'? Because they're not making mining hardware, they're making graphics cards, which they won't hold back to mine with themselves before selling them to consumers. The difference is that ASIC companies have popped up all over the place because there are millions to be made by simply promising to eventually deliver a product, which is a terrible business practice at best. GPU companies have nothing to gain that way: their GPUs sell regardless, we know they exist, and nobody has to pre-order anything.
The Big Dilemma:
We have to pick either the high power consumption of GPUs or being at the mercy of ASIC manufacturers. I believe we can truly make the transition to ASIC mining for the sake of efficiency, but we MUST eliminate the current practice of pre-ordering a product whose only known specs are promises from the people who want your money. Here's why we will always lose under this practice: ASIC manufacturers appeared simply because there's serious money to be made from mining hardware. If they can make hardware with a very short return on investment, why would they sell it? They'll mine with it themselves while it's most profitable, and once they're ready to move on, they'll ship the pre-orders. Think about it: right now, pre-orders give these companies the money to manufacture the ASICs. If the units get built and the ROI is high, why ship them when you can mine with them for a while, dust them off, and then ship?
Another angle: what if ASICs like the Titan really do get built in the near future? A few hundred of these 250 MH/s units would completely eliminate GPU mining through a massive difficulty spike, similar to what happened to bitcoin. Then the entire scrypt network is controlled by a few hundred people. Even if it were a few thousand, spread across several scrypt coins, something meant to be controlled by millions would be in the hands of a few. It's everything scrypt stands against; it was meant to be ASIC-resistant for a reason.
Final thoughts: while I highly doubt they'll actually make a 250 MH/s scrypt ASIC for only $10k at 1000 W, the current practice of funding unknown products has got to go. We can truly make the transition to ASICs once companies only take pre-orders when they have a fully functional prototype to display (not some terrible render of an empty box), and next-gen ASICs don't make the last generation completely irrelevant.
submitted by Atastyham0 to litecoinmining [link] [comments]

I can afford to lose

I first learned of Bitcoin in 2010, installed the client and got my first bitcoin from The Bitcoin Faucet. Thank you Gavin, I assume it was your property.
I fell down the rabbit hole after reading Satoshi's paper (read it about 10 times so far. It gets better each time). I enabled bitcoind's mining feature on my home CPU (and possibly my work machine) and managed to solo mine a block. At the time, they were worth about $0.50 ea. A few months later, and zero blocks found, I compiled some code that allowed me to mine using my nVidia GPU. Still no blocks. Welcome slush pool! Bitcoins were flowing again (about 1/week). At this point, I figured I could make some money at this (price ~$1.00), so I spent $170 on an ATI GPU (about 10 times faster than the nVidia).
Difficulty was increasing, people were building huge GPU rigs - I wasn't one of them. I bought a few coins through MtGox and a few more with paypal. But for the most part, I was happy getting about 1BTC/week on my ATI card. I wasn't ready to spend my "real" money just yet.
I "invested" some assets on GLBSE and put it into a mining company called BITBOND. Was feeling pretty good until GLBSE was shut down and lost 50% of what I invested. I guess Bitcoin Savings and Trust (BST) was a bad idea. But, hey we're just playing around with digital tokens. No worries, I could afford to lose it - maybe $200 worth at the time. Live and learn.
Bitcoin velocity started to increase with Eric's Satoshi Dice and Charlie's BitInstant. Up to $28... Have to buy more. Crash! Back to $2.00 and bailed. Lost another $200.00.
In 2012, I made some paper wallets and gifted all my family members some BTC for x-mas. They were worth about $13/BTC. I still keep an eye on the accounts with watch only addresses on Blockchain.info. Paper wallets were created with Diceware and Amir's sx/libbitcoin, pen and paper.
GPU mining was puttering out, so I spent $300 on a BFL 5 GH/s ASIC with paypal. I received mine in July of 2013, mined 2 BTC over two weeks, then sold it for $350 on ebay. The mined coins were worth about $300, so I doubled my money. The difficulty increases went lunar, and that BFL miner never mined another full bitcoin. Sorry, ebay buyer.
My mining days were over and Coinbase came online. Since then, I've been lazily buying $50/week using Coinbase's recurring buy feature. All the way up to $1200 and all the way back down to $170. I've been here before. Each time though, I have more coins than I had before - and that's all that matters to me.
Bitcoin's not made me rich. I'm not driving a Ferrari with BITCOIN plates... yet. I'm only in as much as I can afford to lose. It's been a fun ride.
To all the people and companies mentioned, and to all of you building the future, Thank You.
submitted by btc-loser to Bitcoin [link] [comments]

Why Litecoin is Set to Skyrocket

*TL;DR Miner migration to litecoin, litecoin's similarity to proven bitcoin, relatively low market cap, Mt. Gox's API's will soon support litecoin, and litecoin's utility as a very liquid form of money is set to cause litecoin's price to skyrocket. *
Read more at http://zamicol.blogspot.com/2013/04/why-i-think-litecoin-is-set-to.html
Excerpts:
Market Cap
With 333,201 blocks mined and 50 litecoins per block, there are currently 16,660,050 litecoins in circulation. The current market price of $2 USD gives Litecoin a market cap of over $33 million.
This may sound like a lot, but consider that even after the "crash" of the past couple of days, Bitcoin's market cap is about $1,157 million USD at the current market price of $98. If litecoin had bitcoin's market cap, each litecoin would be worth over $69.
Granted, this may not be a fair comparison, since in its youth litecoin has not experienced the same proportion of inflation as its older brother. While bitcoin is already past the halfway mark toward its limit of 21 million bitcoins, litecoin is much younger and has not reached the halfway point toward its limit of 84 million litecoins. If we factor in that there will be four times as many litecoins as bitcoins, each of the 84 million litecoins would still be worth over $13 at bitcoin's market cap.
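The supply and implied-price figures above can be reproduced directly; a sketch using only the post's own numbers:

```python
# Reproducing the market-cap comparison from the post (April 2013 figures
# as quoted there, not current values).
blocks_mined     = 333_201
reward_per_block = 50
ltc_supply       = blocks_mined * reward_per_block
print(ltc_supply)  # 16660050 litecoins in circulation

btc_market_cap = 1_157e6  # USD, as quoted in the post
# LTC price if it matched bitcoin's market cap at the current supply:
print(round(btc_market_cap / ltc_supply, 2))  # 69.45, i.e. "over $69"
# ...and at the eventual 84 million coin limit:
print(round(btc_market_cap / 84e6, 2))        # 13.77, i.e. "over $13"
```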
Litecoin Mining Difficulty
A while back, my ears perked up at litecoin's prospects because a new technology was set to give the market a good shaking.
As anticipated, the increase of difficulty of Bitcoin is causing traditional miners to look for more profitable outlets for their existing infrastructure. Hundreds of ASIC miners are now active, which has forced up the difficulty dramatically to an all time high of 7,673,000 with the next difficulty estimated to be near 9,000,000. News of the impending deployment of thousands more is forcing miners to rethink their allocation of existing infrastructure.
Faced with bitcoin’s increasing mining difficulty, GPU Bitcoin miners have three options:
1. Keep mining bitcoin at a potential loss as electricity costs become much greater than the return on mining.
2. Turn off their miners and sell or retire their hardware.
3. Look for more profitable applications for their existing infrastructure, such as litecoin.
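The choice between those options comes down to a profitability calculation. A hedged sketch: the expected-coins formula (hashrate × seconds × reward / (difficulty × 2^32)) is the standard Bitcoin one, but every input value below is illustrative, not taken from the post.

```python
# Expected coins per day for a given hashrate, net of electricity cost.
def daily_profit_usd(hashrate_hs, difficulty, reward_btc, coin_price_usd,
                     watts, usd_per_kwh):
    coins_per_day = hashrate_hs * 86_400 * reward_btc / (difficulty * 2**32)
    electricity = (watts / 1000) * 24 * usd_per_kwh
    return coins_per_day * coin_price_usd - electricity

# e.g. a 600 MH/s GPU rig: difficulty 9e6, 25 BTC reward, $100/BTC,
# 400 W draw, $0.10/kWh (all made-up inputs)
print(round(daily_profit_usd(600e6, 9e6, 25, 100, 400, 0.10), 2))  # ~2.39
```

When the electricity term overtakes the coin term, option 1 is a loss and options 2 and 3 take over; pointing the same formula at a coin with lower relative difficulty is exactly the litecoin argument being made here.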
This is where the power of litecoin is very apparent. Due to its use of the memory-intensive scrypt algorithm, dedicated litecoin hardware like ASIC miners is not anticipated in the near future, giving GPU miners a window of opportunity for profit. Switching GPU hardware from bitcoin to litecoin is only a matter of installing a new mining application and can be done with little configuration. The logical choice for most bitcoin miners will be to move their power to the litecoin network.
In the short time I have been litecoin mining, I have seen the difficulty rise over 600%, meaning there is six times more computing power dedicated to litecoin than only a few months ago. This indicates that many miners have already realized that bitcoin offers them a bleak future and made the switch to litecoin. As these miners transition their hardware, litecoin's mind share will increase, and it shouldn't be a quickly passing event. Miners can take confidence in litecoin knowing that their infrastructure will be valuable for the foreseeable future, and this confidence is bound to pour over into the market price of litecoin.
As an early bitcoin miner, I remember the supply of ATI 5870 graphics cards quickly becoming unavailable as individuals bought up stock for use in bitcoin mining. This infrastructure still exists, and it will not go to waste.
submitted by Zamicol to litecoin [link] [comments]

Block Reward Halving in >7 days, are you ready?

I've been watching the block count almost every day for the past 2 months. I'm mining with a single 5830 spitting out 310 MHash/s. It took 5 days to mine one Bitcoin when I started; now it's up to 14 days. Watch eBay for cheap ATI video cards.
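The "5 days, then 14 days" observation follows from the standard expected-time formula; a sketch where the difficulty values are illustrative, not historical records:

```python
# Expected days for a miner's long-run average share to total 1 BTC
# (the same whether solo or pooled, on average).
def days_per_btc(hashrate_hs: float, difficulty: float, reward: float = 50) -> float:
    btc_per_day = hashrate_hs * 86_400 * reward / (difficulty * 2**32)
    return 1 / btc_per_day

print(round(days_per_btc(310e6, 1.5e6), 1))      # ~4.8 days at an illustrative difficulty
print(round(days_per_btc(310e6, 1.5e6, 25), 1))  # the halving alone doubles it: ~9.6
```

Rising difficulty stretched 5 days into 14, and the reward halving the post is counting down to doubles it again on top of that.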
submitted by willi1147 to Bitcoin [link] [comments]

Looking to get into mining again and questions about multipools

I was a bitcoin miner back when it first came out using fancy ATI graphics cards. It has clearly come a long way since then and I have struggled to keep with all of the different kinds of ASICS that are out there. I am looking to get into mining again and looking for some advice from those who have used multipools:
Thank you /CryptoCurrency compadres!
EDIT: I can't spell multi-pools
submitted by IrrigandumLigno to CryptoCurrency [link] [comments]

How To Mine Doge with Ubuntu 13.10. (Part 1, for AMD Graphics Cards)

I feel the community really needs a resource like this, because if I had a Doge for every time I taught a newbie how to mine on Ubuntu, I would be a very rich doge!
EDIT: Sorry if it looks funny, Reddit doesn't like the numbers I put in for some reason.
If you have Ubuntu you have likely not upgraded because you think you can't mine on 13.10, and while that was true in the past, a few people have it figured out. So feel free to upgrade. If you are not on Ubuntu, you should seriously consider using it, even if it's just on an older desktop.
Alright, let's get a few things clear: I am assuming that you just FRESHLY installed Ubuntu 13.10 x64 on your system, with no drivers and no miners. You will need at least a keyboard and a mouse. The guide will be done in several parts; this first part covers most AMD graphics cards. If you have an Nvidia card, or just want to mine with your CPU, please check back later for those guides!
When reading this guide, please remember that as I give you commands to type into the terminal, anything between { and } must be entered. Do not copy and paste the braces themselves.
Lets get started, If you have not already done so, Install Ubuntu 13.10 64Bit From Here (http://www.ubuntu.com/index_asus.html)
*1. Lets start by making sure you are up to date. Open up the terminal and enter this:
{sudo apt-get update && sudo apt-get upgrade} 
Let that run for a bit.
*2. Once that has finished we are going to download some applications we will need:
{sudo apt-get install dh-make dh-modaliases execstack libxrandr2 libice6 libsm6 libfontconfig1 libxi6 libxcursor1 libgl1-mesa-glx libxinerama1 libqtgui4} 
*3. Once that has run its course we are going to install the SSH server. Remember, the command below will shut your computer down. While it is shut down, carefully install your graphics cards. If you don't have any experience with this, I suggest you take 3 minutes to watch this guide, because installing a graphics card the wrong way can destroy it.
(http://www.youtube.com/watch?v=O9x097QRXeA)
{sudo apt-get install openssh-server} {sudo shutdown now} 
*4.Install your Card(s) and turn the computer back on.
*5. Make a new folder called "AMD DRIVERS123" inside of your Downloads folder. Download the following 3 things into that folder. (1. AMD Drivers: http://support.amd.com/en-us/download/incomplete) (2. AMD APP SDK: http://developer.amd.com/tools-and-sdks/heterogeneous-computing/amd-accelerated-parallel-processing-app-sdk/downloads/) (3. AMD ADL SDK: http://developer.amd.com/tools-and-sdks/graphics-development/display-library-adl-sdk/)
*6. Ok, now we need to unzip and compile those drivers. So, open a terminal by right-clicking inside of "AMD DRIVERS123", then enter this:
{unzip amd-catalyst*.zip} {chmod +x amd-catalyst*.run} {sudo ./amd-catalyst*.run --buildpkg Ubuntu/saucy} 
*7. Now we need to install the drivers. Enter the code exactly as you see it, don't worry, the first command will fail, we are expecting it too.
{sudo dpkg -i fglrx*.deb} {sudo apt-get -f install} {sudo dpkg -i fglrx*.deb} {sudo reboot} 
Your computer should now reboot.
(If this failed, you probably need to remove the old drivers. Do this:
{cd /etc/default} {sudo pico grub} 
Change the line:
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash nomodeset" 
Now save by pressing [CTRL+x]
{sudo update-grub} This will make the change permanent. {sudo reboot -n} Reboot with the new settings. Once it starts back up, try installing the new drivers again.) 
*8.GREAT! You are really on a roll! Now we need to update AtiConfig
{sudo aticonfig --initial --adapter=all} 
*9. Time to install the APP SDK:
{tar xvf AMD-APP-SDK*.tgz} {sudo ./Install-AMD-APP.sh} {sudo reboot} 
*10. WOOT! You are so close to diggin that sweet Doge!! It's time to download CGMiner. IMPORTANT NOTE! You need to get CGMiner 3.7; IT IS THE ONLY ONE THAT WILL WORK. Do not get anything older or newer, just this:
{sudo apt-get install git unzip} {git clone -b 3.7 https://github.com/ckolivas/cgminer} 
*11. Ok, now go back to "AMD DRIVERS123" and unzip the AMD ADL SDK 6.0. Once you have it unzipped, go to the folder called "include", open it up, copy everything inside, then go find the CGMiner folder. Copy these files into CGMiner's folder called ADL_SDK.
*12. To install CGminer, we are going to need a few things. Get all of these.
{sudo apt-get install build-essential autoconf libtool libcurl4-openssl-dev libncurses5-dev pkg-config libudev-dev} 
*13. Lets go ahead and compile CGminer.
{cd cgminer} {./autogen.sh} 
*14. Ok, when you ran that, it told you GPU is not supported; time to fix it.
{./configure --enable-opencl --enable-scrypt} 
(if you did it right you should now get this: OpenCL...............: FOUND. GPU mining support enabled scrypt...............: Enabled ADL..................: SDK found, GPU monitoring support enabled)
*15. If everything has been good so far then
{make} 
*16. MATHEMATICAL! Now lets get it set up to run a test.
{nano test.sh} Then enter the following as the file's contents:
{#!/bin/bash
export DISPLAY=:0
export GPU_MAX_ALLOC_PERCENT=100
export GPU_USE_SYNC_OBJECTS=1
./cgminer -n} 
*17. Now save by hitting [Control+x][y][Enter]
*18. Lastly, let's chmod test.sh
{chmod +x test.sh} 
*19. TIME TO RUN THE TEST!!!
{./test.sh} 
(If your output looks like this your ready to go!!! CL Platform 0 vendor: Advanced Micro Devices, Inc. CL Platform 0 name: AMD Accelerated Parallel Processing CL Platform 0 version: OpenCL 1.2 AMD-APP (1214.3) Platform 0 devices: 1 0 Tahiti GPU 0 AMD Radeon HD 7900 Series hardware monitoring enabled 1 GPU devices max detected)
*20. Excellent! Now we have it installed, its time to pick 2-3 pools, and get accounts set up at each of them. I currently use These 2, and If I find another I like I'll update. If you have a really awesome pool you would like me to include on the list, feel free to message me.
Pools: http://doge.cryptovalley.com/ (great community; server not super stable, but they have a chat. I hang out here a lot under the handle 'StrongBad', feel free to stop by and ask questions)
https://dogehouse.org/ (Super stable pool, Great contests, super friendly, and they pay your miner a bonus if you find the block!)
Choose your pools, go to their sites and sign up. VERY IMPORTANT!!! Use different usernames and passwords for every mining site. If one site gets hacked, you don't want to give them a way to steal everything!!!!! Don't worry about the worker names and passwords being unique or complex though; the most anyone can do with those is mine for you.
*21. Ok, now go to the CGminer folder, and open up a Terminal Window.
{sudo ./cgminer} 
This should start ./cgminer up with some really basic settings, not pointed at any pool. Let's fix that: first press [p] to go to pool settings, then [A] and enter the information for the 1st pool. For instance, if you are signing on to dogehouse: (Input server details: stratum+tcp://stratum.dogehouse.org:3333 Username: 'yourusername.workername' password: 'yourpassword') If you got no errors, you did it right! Your miner should now connect and start to mine very slowly.
Now do the same thing for all your other pools. The reason we do this is that Doge Coin pools are constantly being DDoS attacked, and this way your worker automatically switches over to a good pool if one goes down.
*22. Time to save your current settings. Press [Enter] to get back out to the main menu, then [s] to bring up settings. Now press [W] to write a config file. Name it DogeCoin.conf, and make sure to save it in the location it directs you to.
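For reference (this is not part of the original guide), cgminer can also load these pool settings from the config file written in step 22 via {./cgminer --config DogeCoin.conf}. A minimal sketch of what such a file looks like, with placeholder credentials and a hypothetical second pool URL:

```json
{
  "pools": [
    {
      "url": "stratum+tcp://stratum.dogehouse.org:3333",
      "user": "yourusername.workername",
      "pass": "yourpassword"
    },
    {
      "url": "stratum+tcp://pool2.example.org:3333",
      "user": "yourusername.workername",
      "pass": "yourpassword"
    }
  ],
  "scrypt": true,
  "failover-only": true
}
```

With "failover-only" set, cgminer works the pools in order and only falls through to the next one when the current one is down, which is exactly the DDoS insurance described above.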
*23. OK! So, now we have everything set up to its basics. What you need to do now is sit down and fine-tune your card(s). It's really early in the morning and I need to get some sleep; I will add more on how to tune your card tomorrow in another post, and link it HERE:
Or, I will help you find the best configuration for your card(s). However as this is quite a bit of work, I do charge a small fee. Contact me with your card(s) information and I will get back to you right away. I generally let you decide how much to pay me.
This is my first ever guide on reddit! If you enjoyed it, or if it helped you please remember to upvote! I am going to start doing an educational YouTube series about Bitcoin, and will likely be doing a side program about Doge, if I find time and funding!! You can find that here: All tips are appreciated! DPTwcQreASwzt6TeWBWFb6Kz9ZU5Sezvr9 If you have any Questions, feel free to ask, I will get back to you ASAP.
Happy Digging everybody!
submitted by Sonofchange to Dogecoinmining [link] [comments]

Linux headless setup

I'm setting up a mining rig using 3x 7950s, and have run into a bit of a roadblock. I've seen a few guides using xubuntu 12.10, but none with xubuntu 13.04.
darth_bunny's guide
cryptobadger's guide
gentoo (bitcoin)
I've tried the first 2 ways on fresh images, but for some reason X never starts...
aticonfig --adapter=0 --od-getclocks ERROR - X needs to be running to perform AMD Overdrive(TM) commands 
Following the darth_bunny's guide way I built to 13.04 instead of 12.10
./amd-driver-installer-catalyst-13.3-beta3-linux-x86.x86_64.run --buildpkg Ubuntu/raring 
On a side note, whenever I plug the ATI cards in, video (connected to onboard) does not output. I'm guessing this is because the ATI cards become primary? It doesn't bother me much as I ssh in anyway.
I also have some missing lib's when trying to run cgminer, but that is something to worry about later.
Has anyone solved this on 13.04, or should I just downgrade to 12.10?
submitted by atrueresistance to litecoinmining [link] [comments]

'Radeon R9 280X Gaming' crashes when I start playing games

Troubleshooting Help:

What is your parts list? Consider formatting your parts list.
Part list permalink / Part price breakdown by merchant
Type Item Price
GPU Radeon R9 280X GAMING 3G
CPU AMD FX-8320
Motherboard ASUS M5A97 R2.0 AM3+ AMD 970 + SB 950 SATA 6Gb/s USB 3.0 ATX AMD Motherboard with UEFI BIOS
Memory G.Skill 8GB (2 x 4GB) DDR3-1866 Memory
Describe your problem. List any error messages and symptoms. Be descriptive.
I've really only used this desktop for school, but am now switching over to start gaming on it. However, once I start playing a game (like Overwatch), the screen starts to flicker a bit, and a couple of minutes later it goes to a black screen with the computer still running in the background. I cannot click anything, but sometimes I can still hear the game music playing behind the black screen.
I am able to browse the internet, YouTube, etc. It just crashes when I start doing something intensive like playing a game.
List anything you've done in attempt to diagnose or fix the problem.
I tried following this guide to flash it, but the program has some bug in it; every time I try, it messes up. I've opened up the computer and looked at all the connections, and they are seated.
Provide any additional details you wish below.
The GPU was pre-owned; they used it for bitcoin mining briefly. I'm not sure what they did with its settings, like overclocking, since I don't know much about that, and I'm not able to reach them either. I also looked at MSI Afterburner to see if it was maybe overheating, but it seemed to only get up to 47°C, so I don't think that is the issue either.
submitted by msi280x to buildapc [link] [comments]


