Today I took a survey in Catalan; it was the first one I've ever seen with the option "non-binary"!
I was taking a survey in Catalan (about linguistics). When I finished it, I saw that it was asking my gender. I thought "Well, I will put 'other', as always". But then I saw that the options were male, female, non-binary and other! I felt so happy that I even said "Thanks for including non-binary identities" in the "other comments" section! Just wanted to share, because it's the first time I've seen the term "non-binary" in my mother tongue in an official or public place :)
VR is not what a lot of people think it is. It's not comparable to racing wheels, Kinect, or 3DTVs. It offers a shift the game industry hasn't had before; a first of its kind. I'm going to outline what VR is like today, despite the many misconceptions around it, and what it will be like as it grows. What people find to be insurmountable problems are often solvable. What is VR in 2020? Something far more versatile and far-reaching than people comprehend. All game genres and camera perspectives work, so you're still able to access the types of games you've always enjoyed. It is often thought that VR is a 1st person medium and that's all it can do, but 3rd person and top-down VR games are a thing and in various cases are highly praised. Astro Bot, a 3rd person platformer, was the highest-rated VR game before Half-Life: Alyx. Let's crush some misconceptions of 2020 VR:
The buy-in is $400 on average, not $1000 as that is Valve Index pricing.
Motion sickness is easily avoidable for most people by sticking to games that have 1:1 fully synced or mostly synced body movement like Beat Saber or even Alyx with teleportation.
Most VR games offer locomotion options so teleporting is certainly not a required norm.
You don't need a PC or console; Oculus Quest is the start of the new norm where headsets are self-contained.
You are not required to stand or move about. VR has always allowed you to relax in the same way as traditional gaming by sitting on the couch with a gamepad.
VR isn't anti-social. It's actually the pinnacle of social communication devices. What it is (currently) is potentially isolating depending on how you use it.
People with disabilities often think VR is not for them, when in all likelihood it is: most disabilities work fine with VR, and many people stand to gain a lot from using it.
The setup of VR is much faster than it was just a few years ago thanks to inside-out tracking and standalone headsets. A Quest user can get going within 10 seconds.
So what are the problems with VR in 2020?
Low resolution and low FoV.
Wireless isn't standard.
Only a few released AAA exclusive games.
Potential for eye strain and headaches.
Some headsets feel really outdated. (PSVR)
Full body avatars don't align correctly.
Despite these downsides, VR still offers something truly special. What it enables is not just a more immersive way to game, but new ways to feel, to experience stories, to cooperate with or fight against other players, and a plethora of new ways to interact, which is the beating heart of gaming as a medium.

To give an example, Boneworks is a game with experimental full-body physics, and the amount of extra agency it provides is staggering. When you can manipulate physics on a level this intimate, directly controlling things in a way that traditional gaming simply can't allow, it opens up a whole new avenue of gameplay and game design. Things aren't based on a series of state machines anymore. "Is the player pressing the action button to climb this ladder or not?" "Is the player pressing the aim button to aim down the sights or not?" These aren't binary choices in VR. Everything is freeform, and you can be in any number of states at a given time. Instead of climbing a ladder with an animation lock, you can grab on with one hand while aiming with the other; or, if it's physically modelled, you could pick the ladder up and plant it on a pipe sticking out of the ground to make your own makeshift trap that spins as it pivots on top of the pipe, knocking away anything that comes close. That's the power of physics in VR. You do things you think of in the same vein as reality, instead of thinking inside the set limitations of the designers. Even MGSV has its limitations on the freedom it provides, but that expands exponentially with 6DoF VR input and physics.

I talked about how VR could make you feel things. A character or person that gets close to you in VR is going to invade your literal personal space. Heights can start to feel like you are biologically in danger. The idea of tight spaces in, say, a horror game can cause claustrophobia.
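The "state machine vs. freeform" distinction above can be sketched in a few lines. This is a toy illustration, not any engine's actual code: the flat state machine can only ever be in one state, while giving each tracked hand its own state lets combinations like "climbing while aiming" emerge for free.

```python
from dataclasses import dataclass
from typing import Optional

def traditional_update(pressed_action: bool, pressed_aim: bool) -> str:
    """Flat state machine: the player is in exactly one state at a time."""
    if pressed_action:
        return "climbing_ladder"  # animation lock; aiming is impossible here
    if pressed_aim:
        return "aiming"
    return "idle"

@dataclass
class Hand:
    grabbing: Optional[str] = None   # e.g. a ladder rung, or None
    held_item: Optional[str] = None  # e.g. a weapon being aimed

def vr_states(left: Hand, right: Hand) -> set:
    """In VR each tracked hand carries its own state, so combinations emerge."""
    states = set()
    if left.grabbing or right.grabbing:
        states.add("climbing")
    if left.held_item or right.held_item:
        states.add("aiming")
    return states or {"idle"}
```

Here `vr_states(Hand(grabbing="rung"), Hand(held_item="pistol"))` yields both "climbing" and "aiming" at once, a combination the flat state machine has no way to express.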
The way you move or interact with things can give off subtle, almost phantom-limb-like feelings, because the overwhelming visual and audio stimulation enables you to do things you haven't experienced with your real body; an example being floating around in zero gravity in Lone Echo. So it's not without its share of problems, but it's an incredibly versatile gaming technology in 2020. It's also worth noting just how important it is as a non-gaming device, because there simply isn't a device better suited to a worldwide pandemic than VR. Simply put, it's one of the most important devices you can get right now for that reason alone: you can socially connect face to face with no distancing, travel and attend all sorts of events, and manage your mental and physical health in ways the average person badly wishes for right now.

Where VR is (probably) going to be in 5 years

You can expect a lot: a seismic shift that will make the VR of today feel like something very different. This is because the underlying technology is being reinvented with entirely custom tech that no longer relies on cell-phone panels and lenses that have existed for decades.
The resolution will be around the equivalent of 1080p monitors, so you'd probably be looking at 4K x 4K per eye or higher.
The field of view will be 30-40% wider.
Eye strain and headaches will be solved via varifocal displays, making VR even more comfortable visually than 2D displays, which still have these issues themselves and, unlike stereoscopic displays, have no way to solve them.
Isolation will be solved with mixed-reality reconstruction, enabling the real world to bleed into VR on a per-object basis in real time. VR headsets will then be, in every sense, MR headsets (VR+AR in one device).
There will be plenty of non-gaming apps gaining bigger traction like some sort of social space or event-based app.
PlayStation and Xbox will both support VR and a PSVR2 headset will have launched.
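The first two predictions are coupled: what matters perceptually is pixels per angular degree, so a wider field of view only helps if resolution rises with it. A rough sketch of that arithmetic (hypothetical round numbers, not any specific headset's spec; real lens optics complicate this):

```python
def pixels_per_degree(horizontal_pixels: int, horizontal_fov_deg: float) -> float:
    """Rough angular resolution; ignores lens distortion and sub-pixel layout."""
    return horizontal_pixels / horizontal_fov_deg

# Illustrative only: a 2020-class panel vs. the predicted higher-res, wider-FoV one.
today = pixels_per_degree(1800, 100)   # 18 pixels per degree
future = pixels_per_degree(4000, 140)  # ~29 pixels per degree
```

Even with a 40% wider field of view, the predicted panel would still come out noticeably sharper per degree, which is why both numbers have to move together.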
That's enough to solve almost all the issues of the technology and make it a buy-in for the average gamer. In 5 years, we should really start to see the blending of reality and virtual reality, and how close the two can feel.

Where VR is (probably) going to be in 10 years
VR is now effectively photorealistic in the visual and audio department and it's extremely hard if not impossible at times to tell the difference between the real world and the virtual world.
Quite a number of people start to live big chunks of their lives in VR.
Light-field 6DoF video will be common, allowing you to move inside live video or a recorded playback that is in every way indistinguishable from reality, at least visually and audibly.
Streaming becomes mainstream as an option to consume games and it is now starting to become feasible to stream VR games as well.
VAR (VR+AR) starts to replace traditional displays and devices, with monitors, phones, and handhelds especially on their way out, but TVs very likely still hold a strong presence due to their communal nature.
If consoles still exist, their new features now focus mostly on VR and on integrating as seamlessly as possible into the VAR experience. Traditional gaming is still likely the most popular way to play, but consoles must find ways to market toward the new medium.
VAR is the new norm for work, education, communication, entertainment, and many aspects of daily life.
AAA VRMMORPGs start to get popular and become the new standard for the genre, revitalizing it.
The metaverse starts to form in some small way, not yet reaching the magnitude of something like the OASIS, but still a very large and versatile world or web of worlds where the phrase "Do anything, go anywhere, become anyone, be with anyone" is the truest it's ever been.
In short, as good as if not better than the base technology of Ready Player One, which consists of a visor and gloves. Interestingly, RPO missed out on the merging of VR and AR, which will play an important part in the future of HMDs as they become more versatile, easier to multi-task with, and more ingrained into daily life, where physical isolation is only a user choice. Useful treadmills and/or treadmill shoes, as well as haptic suits, will likely become (and stay) enthusiast items that are incredible in their own right but, due to the commitment, aren't applicable to the average person - in a way, just like RPO. At this stage, VR is mainstream, with loads of AAA content coming out yearly and providing gaming experiences that are incomprehensible to most people today. Overall, the future of VR couldn't be brighter. It's absolutely here to stay, it's more incredible than people realize today, and it's only going to get exponentially better and more convenient in ways that people can't imagine.
As you may have seen, I sent the following Tweet: “The Apple ARM MacBook future is coming, maybe sooner than people expect” https://twitter.com/choco_bit/status/1266200305009676289?s=20 Today, I would like to further elaborate on that.

tl;dr: Apple will be moving to ARM-based Macs in what I believe are 4 stages, starting around 2015 and ending around 2023-2025: release of T1-chip MacBooks, release of T2-chip MacBooks, release of at least one lower-end ARM MacBook model, and transitioning the full lineup to ARM. Reasons for each are below.

Apple is very likely going to switch their CPU platform to their in-house silicon designs with an ARM architecture. This understanding is fairly common amongst various Apple insiders. Here is my personal take on how this switch will happen and be presented to the consumer. The first question would likely be “Why would Apple do this again?”. Throughout their history, Apple has already made two other storied CPU architecture switches - first from the Motorola 68k to PowerPC in the early 90s, then from PowerPC to Intel in the mid 2000s. Why make yet another? Here are the leading reasons:
Intel has, in recent years, been taking significant losses in both reputation and actual product value, as well as velocity of product development, breaking their “Tick-Tock” cycle for the first time in decades. Most recently, they have fallen well behind AMD’s processor lines in cost-to-performance ratio, CPU core count, core design (monolithic vs. “chiplet”), power consumption relative to performance, silicon supply (Intel having significant manufacturing-process and yield issues), and on-silicon security features. While Intel still wins out in certain enterprise and datacenter applications, as well as having a much better reputation for reliability and QA (AMD having shipped numerous chips with a broken random-number generator that prevented some mainstream operating systems from even booting), the number of such applications dwindles with each new release from AMD and as confidence among enterprise decision-makers increases. In the public consciousness, Intel is quickly becoming a point of ridicule against Apple’s Mac lineup, rather than a badge of honor.
By moving to their own designs, Apple will be free from Intel’s release schedule, which has recently been unpredictable and beset by routine delays due to poor manufacturing yields. Apple will be able to update their Mac lineup on their own timeline, rather than being forced to delay products based on Intel’s ability to meet the release window. This also allows them to leverage relationships with other silicon fabricators to source chips, rather than relying on Intel’s continued “iteration” that’s leading to a “14nm++++++++++” process, or the continued lack of product diversity on the 10nm process. Apple will also be free to innovate in the design of the silicon platform, rather than being limited by Intel’s design choices. By having full control of the manufacturing and development cycle, Apple can bring even more in-house optimization to macOS, as they have been doing for iOS and iPadOS over the years.
Using an ARM architecture on the Macs allows for a more unified Apple ecosystem, rather than having separate Mac and iOS-based products. The only distinction will be the device form factor and performance characteristics.
The x86_64 architecture is very old and inefficient, using older methodologies for processor design (CISC vs. ARM’s RISC), and the instruction set continues to require silicon support for emulating 1980s-vintage 16-bit modes, as well as ineffectual and archaic memory-addressing modes (segmentation, etc.). The x86_64 architecture is like a city built atop a much older city, built atop a yet older city, with every layer built to a level of complexity that suited its time and no further.
Over the last 10 years, Apple has shown that they can consistently produce impressive silicon designs, often leading the market in performance and capability, and Apple has been aggressively acquiring silicon design talent.
A common refrain heard on the Internet is the suggestion that Apple should switch to using CPUs made by AMD, and while this has been considered internally, it will most likely not be chosen as the path forward, even for their megalithic giants like the Mac Pro. Even though AMD would mitigate Intel’s current set of problems, it does nothing to address the x86_64 architecture’s own problems and inefficiencies, on top of jumping to a platform that doesn’t have a decade of proven support behind it. Why spend a lot of effort re-designing and re-optimizing for AMD’s platform when you can put that effort into your own, and continue the vertical integration Apple is well known for? I believe that internal development for the ARM transition started around 2015/2016 and is happening in 4 distinct stages. Not all of this comes from Apple insiders; some of it is my own interpretation based on information gathered from supply-chain sources, examination of MacBook schematics, and other indicators from Apple.
Stage 1 (from 2014/2015 to 2017):
The rollout of computers with Apple’s T1 chip as a coprocessor. This chip is very similar to Apple’s T8002 chip design, which was used for the Apple Watch Series 1 and Series 2. The T1 is primarily present on the first TouchID-enabled Macs, the 2016 and 2017 model-year MacBook Pros. Considering the amount of time required to design and validate a processor, this stage most likely started around 2014 or 2015, with early experimentation to see whether an entirely new chip design would be required, or if it would be sufficient to repurpose something in the existing lineup. As we can see, the general-purpose ARM processors aren’t a one-trick pony. To get a sense of the decision making at the time, let’s look back a bit. The year is 2016, and we're witnessing the beginning of the stagnation of Intel's processor lineup. There is not a lot to look forward to other than another “+” being added to the 14nm fabrication process. The MacBook Pro has used the same design for many years now, and its age is starting to show. Moving to AMD is still very questionable, as they’ve historically not been able to match Intel’s performance or functionality, especially at the high end; since the “Ryzen” lineup is still unreleased, there are absolutely no benchmarks or other data to show they are worth consideration, and AMD’s most recent line of “Bulldozer” processors was very poorly received. Now is probably as good a time as any to begin experimenting with the in-house ARM designs, but it’s not time to dive into the deep end yet: our chips are not nearly mature enough to compete, and it’s not yet certain how long Intel will be stuck in the mud. As well, it is widely understood that Apple and Intel have an exclusivity contract in exchange for advantageous pricing. Any transition would take considerable time and effort, and since there are no current viable alternatives to Intel, the in-house chips will need to advance further, and breaching a contract with Intel is too great a risk.
So it makes sense to start with small deployments, to extend the timeline, stretch out to the end of the contract, and eventually release a real banger of a Mac. Thus, the 2016 Touch Bar MacBooks were born, alongside the T1 chip mentioned earlier. There are good reasons for abandoning the piece of hardware previously used for a similar purpose, the SMC or System Management Controller. I suspect that the biggest reason was to allow early analysis of the challenges that would be faced migrating Mac built-in peripherals and IO to an ARM-based controller, as well as exploring the manufacturing, power, and performance results of using the chips across a broad deployment, and analyzing any early failure data, then using this to patch issues, enhance processes, and inform future designs looking towards the 2nd stage. The former SMC duties now moved to the T1 include things like:
Fan speed, voltage, amperage and thermal sensor feedback data
FaceTime camera and microphone IO
PMIC (Power Management Controller)
Direct communication to NAND (solid state storage)
Direct communication with the Touch Bar
Secure Enclave for TouchID
The T1 chip also communicates with a number of other controllers to manage a MacBook’s behavior. Even though it’s not a very powerful CPU by modern standards, it’s already responsible for a large chunk of the machine’s operation. Moving control of these peripherals to the T1 chip also brought about the creation of the fabled BridgeOS software, a shrunken-down watchOS-based system that operates fully independently of macOS and the primary Intel processor. BridgeOS is the first step for Apple’s engineering teams to begin migrating underlying systems and services to integrate with the ARM processor, and it allowed internal teams to more easily and safely develop and issue firmware updates. Since BridgeOS is based on a standard and now well-known system, it means that they can leverage existing engineering expertise to flesh out the T1’s development, rather than relying on the more arcane and specialized SMC system, which operates completely differently and requires highly specific knowledge to work with. It also allows reuse of the same fabrication pipeline used for Apple Watch processors, and eliminates the need to have yet another IC design for the SMC, coming from a separate source, to save a bit on cost. Also during this time, on the software side, “Project Marzipan”, today Catalyst, came into existence. We'll get to this shortly. For the most part, Stage 1 went off without any major issues. There were a few firmware problems at first during the product launch, but they were quickly solved with software updates. Now that engineering teams had experience building, manufacturing, and shipping the T1 systems, Stage 2 could begin.
Stage 2 (2017/2018 to Present):

Stage 2 encompasses the rollout of Macs with the T2 coprocessor, replacing the T1. This includes a much wider lineup: the MacBook Pro with Touch Bar starting with 2018 models, the MacBook Air starting with 2018 models, the iMac Pro, the 2019 Mac Pro, and the Mac Mini starting in 2018. With this iteration, the more powerful T8012 processor design was used, a further revision of the T8010 design that powers the A10 series processors used in the iPhone 7. This change provided a significant increase in computational ability and brought about the integration of even more devices into the T2. In addition to the T1’s existing responsibilities, the T2 now controls:
Full audio subsystem
Secure Enclave for internal NAND storage and encryption/decryption offload
Management of the whole system’s power and startup sequence, allowing for trusted boot (ensuring a boot chain of trust, with no malicious code, rootkits, or bootkits)
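The trusted-boot idea in that last bullet can be sketched as a chain of measurements, each boot stage being checked against a value anchored in immutable ROM before control is handed off. This is a toy model with hypothetical stage names and hash-only checks; the real system verifies cryptographic signatures, not bare hashes:

```python
import hashlib

def measure(image: bytes) -> str:
    """Hash a firmware image (real chains verify signatures, not bare hashes)."""
    return hashlib.sha256(image).hexdigest()

# Hypothetical stand-ins for successive boot stages:
stages = [b"bridgeos-firmware", b"boot-payload", b"kernel-image"]

# Expected measurements, anchored at the root by immutable Boot ROM:
trusted_chain = [measure(s) for s in stages]

def verify_boot(images, chain) -> bool:
    """Refuse to boot if any stage's measurement deviates from the chain."""
    return len(images) == len(chain) and all(
        measure(img) == want for img, want in zip(images, chain)
    )
```

Tampering with any stage (say, replacing the second image with a bootkit) makes the verification fail, which is why the machine can refuse to boot modified firmware.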
Those last two points are crucial for Stage 2. Under this new paradigm, the vast majority of the Mac is now under the control of an in-house ARM processor, and Stage 2 brings iPhone-grade hardware security to the Mac. These T2 models also incorporated a supported DFU mode (Device Firmware Update, more commonly “recovery mode”), which acts similarly to the iPhone DFU mode and allows restoration of the BridgeOS firmware in the event of corruption (most commonly due to user-triggered power interruption during flashing). Putting more responsibility onto the T2 again allows Apple’s engineering teams to do more early failure analysis on hardware and software, monitor the stability of these machines, experiment further with large-scale production and deployment of this ARM platform, and continue to enhance the silicon for Stage 3. A few new user-visible features were added in this stage as well, such as support for the passive “Hey Siri” trigger, and offloading image and video transcoding to the T2 chip, which frees up the main Intel processor for other applications. BridgeOS was bumped to 2.0 to support all of these changes and the new chip. On the macOS software side, what was internally known as Project Marzipan was first demonstrated to the public. Though it was originally discovered around 2017, and most likely began development and testing in the later parts of Stage 1, its effects could be seen in 2018 with the release of iPhone apps now running on the Mac using the iOS SDKs: Voice Memos, Apple News, Home, Stocks, and more, with an official announcement and public release at WWDC in 2019. Catalyst became the public name for Marzipan. This SDK allows app developers to easily port iOS apps to run on macOS, with minimal or no code changes, and without needing to develop separate versions for each.
The end goal is to allow developers to submit a single version of an app and have it work seamlessly on all Apple platforms, from Watch to Mac. At present, iOS and iPadOS apps are compiled for the full gamut of ARM instruction sets used on those devices, while macOS apps are compiled for x86_64. The logical next step is to cross this bridge and unify the instruction sets. With the T2 release, the new products using it have not been quite as well received as those with the T1. Many users have noticed how this change contributes further towards machines with limited to no repair options outside of Apple’s repair organization, along with some general bugs in the T2. Products with the T2 also no longer have the “Lifeboat” connector, which was previously present on 2016 and 2017 model Touch Bar MacBook Pros. This connector allowed a certified technician to plug in a device called a CDM Tool (Customer Data Migration Tool) to recover data off of a machine that was not functional. The removal of this connector limits the options for data recovery in the event of a problem, and Apple has never offered any data recovery service, meaning that an irreparable failure of the T2 chip or the primary board would result in complete data loss, in part due to the strong encryption provided by the T2 chip (even if the data could be read off, the encryption keys were lost with the T2 chip). The T2 also brought about the pairing of serial numbers for certain internal components, such as the solid-state storage, display, and trackpad. In fact, many other controllers on the logic board are now also paired to the T2, such as the WiFi and Bluetooth controller, the PMIC (Power Management Controller), and several others. This is the exact same system used on newer iPhone models and is quite familiar to technicians who repair iPhone logic boards.
While these changes are fantastic for device security and for corporate and enterprise users, allowing a very high degree of assurance that devices will refuse to boot if tampered with in any way - even via storied supply-chain attacks, or other malfeasance possible with physical access to a machine - they have created difficulty for consumers, who more often lack the expertise or awareness to keep critical data backed up, as well as the funds for the necessary repairs from authorized repair providers. Other reported issues suspected to be related to the T2 are audio “cracking” or distortion on the internal speakers, and BridgeOS becoming corrupt following a firmware update, resulting in a machine that can’t boot. I believe these hiccups will be properly addressed once macOS is fully integrated with the ARM platform. This stage of the Mac is more like a chimera of an iPhone and an Intel-based computer. Technically, it does have all of the parts of an iPhone present within it, cellular radio aside, and I suspect this fusion is why these issues exist. Recently, security researchers discovered an underlying security problem present within the Boot ROM code of the T1 and T2 chips. Being the same fundamental platform as earlier Apple Watch and iPhone processors, they are vulnerable to the “checkm8” exploit (CVE-2019-8900). Because of how these chips operate in a Mac, firmware modifications caused by use of the exploit will persist through OS reinstallation and machine restarts. Both the T1 and T2 chips are always on and running, though potentially in a heavily reduced power state, meaning the only way to clean an exploited machine is to reflash the chip, triggering a restart, or to fully exhaust or physically disconnect the battery to flush its memory.
Fortunately, this exploit cannot be performed remotely and requires physical access to the Mac for an extended duration, as well as a second Mac to perform the change, so the majority of users are relatively safe. As well, with a very limited execution environment and access to the primary system only through a “mailbox” protocol, the utility of exploiting these chips is extremely limited. At present, there is no known malware that has used this exploit. The proper fix will come with the next hardware revision, and is considered a low priority due to the lack of practical uses for running malicious code on the coprocessor. At the time of writing, all current Apple computers have a T2 chip present, with the exception of the 2019 iMac lineup. This will change very soon with the expected release of the 2020 iMac lineup at WWDC, which will incorporate a T2 coprocessor as well. Note: from here on, this turns entirely into speculation based on info gathered from a variety of disparate sources. Right now, we are in the final steps of Stage 2. There are strong signs that a MacBook (12”) with an ARM main processor will be announced this year at WWDC (“One more thing...”), at a Fall 2020 event, a Q1 2021 event, or WWDC 2021. Based on the lack of a more concrete answer, WWDC 2020 will likely not see it, but I am open to being wrong here.
Stage 3 (Present/2021 to 2022/2023):
Stage 3 involves the introduction of at least one fully ARM-powered Mac into Apple’s computer lineup. I expect this will come in the form of the previously retired 12” MacBook. There are rumors that Apple is still working internally to perfect the infamous Butterfly keyboard, and there are also signs that Apple is developing an A14X-based processor with 8-12 cores designed specifically for use as the primary processor in a Mac. It makes sense that this model could see the return of the Butterfly keyboard, considering how thin and light it is intended to be, and using an A14X processor would make it a very capable, very portable machine, and should give customers a good taste of what is to come. Personally, I am excited to test the new 12" “ARMbook”. I do miss my own original 12", even with all the CPU failure issues those older models had; it was a lovely form factor for me. It's still not entirely known whether the physical design will change from the retired version, exactly how many cores it will have, the port configuration, etc. I have also heard rumors about the 12” model possibly supporting 5G cellular connectivity natively thanks to the A14 series processor. All of this will most likely be confirmed soon enough. This 12” model will be the perfect stepping stone for Stage 3, since Apple’s ARM processors are not yet a full-on replacement for Intel’s entire processor lineup, especially at the high end, in products such as the upcoming 2020 iMac, the iMac Pro, the 16” MacBook Pro, and the 2019 Mac Pro. Performance of Apple’s ARM platform compared to Intel’s has been a big point of contention over the last couple of years, primarily due to the lack of data representative of real-world desktop usage scenarios. The iPad Pro and other models with Apple’s highest-end silicon still can’t run a lot of high-end professional applications, so benchmark data about anything beyond video- and photo-editing tasks quickly becomes meaningless.
While there are purely synthetic benchmarks like Geekbench, AnTuTu, and others that try to bridge the gap, they are far from accurate or representative of real-world performance in many instances. Even though Apple’s ARM processors are incredibly powerful, and I do give constant praise to their silicon design teams, there still just isn’t enough data to show how they will perform in real-world desktop usage scenarios, and synthetic benchmarks are like standardized testing: they only show how good a platform is at running the synthetic benchmark. This type of benchmark stresses only very specific parts of each chip at a time, rather than measuring how well it does a general task, and then boils the complexity and nuances of each chip down into a single numeric score, which is not a remotely accurate way of representing processors with vastly different capabilities and designs. It would be like gauging how well a person performs a manual labor task by averaging only the speed of every individual muscle in the body, regardless of whether, or how much, each is used. A specific group of muscles being stronger or weaker than others could wildly skew the final result and grossly misrepresent the performance of the person as a whole. Real-world program performance will be the key in determining the success and future of this transition, and it will have to be great on this 12" model - not just in a limited set of tasks; it will have to be great at *everything*. It is intended to be the first Horseman of the Apocalypse for the Intel Mac, and it had better behave like one. Consumers have been expecting this, especially after 15 years of Intel processors, the continued advancement of Apple’s processors, and the decline of Intel’s market lead. The point of this “demonstration” model is to ease both users and developers into the desktop ARM ecosystem slowly.
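The muscle analogy above can be made concrete with a toy calculation. All numbers here are made up purely for illustration; the point is that an equally weighted average can rank two chips in the opposite order from a workload that only exercises some units:

```python
# Two hypothetical chips with made-up per-unit scores:
chip_a = {"int": 90, "float": 90, "memory": 90, "crypto": 10}  # weak in one niche unit
chip_b = {"int": 70, "float": 70, "memory": 70, "crypto": 80}

def synthetic_score(chip):
    """What a naive benchmark does: average every subtest equally."""
    return sum(chip.values()) / len(chip)

def workload_score(chip, weights):
    """What a real workload does: weight units by how much it uses them."""
    return sum(chip[unit] * w for unit, w in weights.items())

# A compile-heavy job that never touches the crypto unit:
compile_job = {"int": 0.5, "float": 0.1, "memory": 0.4, "crypto": 0.0}
```

With these numbers, chip B "wins" the synthetic average (72.5 vs. 70.0), yet chip A dominates the actual compile workload (90.0 vs. 70.0), which is exactly how a single aggregate score can misrepresent two very different designs.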
Much like how the iPhone X paved the way for Face ID-enabled iPhones, this 12" model will pave the way towards ARM Mac systems. Some power-user-type consumers may complain at first, depending on the software compatibility story, then realize it works just fine, since the majority of computer users today do not do many tasks that can’t be accomplished on an iPad or a lower-end computer. Apple needs to gain the public’s trust for basic tasks first before it will be able to break into the market of users performing more hardcore or “Pro” tasks. This early model will probably not be targeted at these high-end professionals, which will allow Apple to begin gathering early information about the stability and performance of this model, day-to-day usability, developmental issues that need to be addressed, hardware failure analysis, etc. All of this information is crucial to Stage 4, or possibly later parts of Stage 3. The two biggest concerns most people have with the architecture change are app support and Bootcamp. Any apps released through the Mac App Store will not be a problem: because App Store apps are submitted as LLVM IR (“Bitcode”), the system can automatically download versions compiled and optimized for ARM platforms, similar to how App Thinning works on iOS. For apps distributed outside the App Store, things might be trickier. There are a few ways this could go:
Developers will need to build both x86_64 and ARM versions of their app - App Bundles have supported multiple-architecture binaries since the dawn of OS X and the PowerPC transition
Move to apps being distributed in an architecture-independent manner, as they are on the App Store. There are some software changes suggestive of this, such as the new architecture in dyld3.
An x86_64 instruction decoder in silicon - very unlikely, due to the significant overhead this would create in the silicon design and potential licensing issues (ARM, a RISC or “reduced instruction set” architecture, has comparatively few instructions; x86_64 has thousands)
Server-side ahead-of-time transpilation (converting x86 code to equivalent ARM code) using Notarization submissions - Apple certainly has the compiler chops in the LLVM team to do something like this
Outright emulation, similar to the approach taken in ARM releases of Windows, which was received extremely poorly (limited to 32-bit apps, and very, very slow). There could be other solutions in the works to fix this, but I am not aware of any; this is just me speculating about some of the possibilities.
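The first option in the list above rests on multi-architecture ("fat"/universal) Mach-O files, where one file carries a slice per CPU architecture. As a sketch of the on-disk idea: the magic number and CPU-type constants below are the real Mach-O values, but the offsets, sizes, and alignment are invented for illustration; real universal binaries are built with tools like `lipo` or `clang` with multiple `-arch` flags, not by hand.

```python
# Hand-build a minimal fat (universal) header for a hypothetical binary
# containing x86_64 and arm64 slices, then parse it back. All fat-header
# fields are big-endian 32-bit integers.
import struct

FAT_MAGIC = 0xCAFEBABE          # big-endian magic for a universal binary
CPU_TYPE_X86_64 = 0x01000007    # CPU_TYPE_X86 | CPU_ARCH_ABI64
CPU_TYPE_ARM64 = 0x0100000C     # CPU_TYPE_ARM | CPU_ARCH_ABI64

def build_fat_header(slices):
    """Pack magic + arch count, then one fat_arch record per slice:
    (cputype, cpusubtype, file offset, size, alignment)."""
    out = struct.pack(">II", FAT_MAGIC, len(slices))
    for cputype, offset, size in slices:
        out += struct.pack(">IIIII", cputype, 0, offset, size, 14)
    return out

def list_architectures(blob):
    """Return the cputypes present, roughly what `lipo -info` reports."""
    magic, count = struct.unpack_from(">II", blob, 0)
    assert magic == FAT_MAGIC, "not a universal binary"
    return [struct.unpack_from(">IIIII", blob, 8 + 20 * i)[0]
            for i in range(count)]

# Offsets and sizes here are arbitrary illustrative numbers.
blob = build_fat_header([(CPU_TYPE_X86_64, 16384, 50000),
                         (CPU_TYPE_ARM64, 81920, 48000)])
print([hex(t) for t in list_architectures(blob)])  # ['0x1000007', '0x100000c']
```

The loader simply picks the slice matching the running CPU, which is why the same bundle can ship to both Intel and ARM Macs.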
As for Bootcamp, while ARM-compatible versions of Windows do exist and are in development, they come with their own similar set of app support problems. Microsoft has experimented with emulating x86_64 on its ARM-based Surface products, and some other OEMs have created their own Windows-powered ARM laptops, but with very little success. Performance is a problem across the board, with other ARM silicon not being anywhere near as advanced, and with the majority of Windows apps not developed in-house at Microsoft running terribly under x86_64 emulation. If Bootcamp does come to the early ARM MacBook, it will more than likely run very poorly for anything other than Windows UWP apps. There is a high chance it will be abandoned entirely until Windows becomes much more friendly to the architecture. I believe this will also be a very crucial turning point for the MacBook lineup as a whole. At present, the iPad Pro paired with the Magic Keyboard is, in many ways, nearly identical to a laptop, with the biggest difference being the system software itself. While Apple executives have outright denied plans of merging the iPad and MacBook lines, that could very well just be a marketing stance, shutting down the rumors in anticipation of a well-executed surprise. I think that Apple might at least re-examine the possibility of merging Macs and iPads in some capacity, but whether they proceed or not could be driven by consumer reaction to both products. Do they prefer the feel and usability of macOS on ARM, and like the separation of both products? Is there success across the industry for the ARM platform, at both the lower and higher ends of the market? Do users see that iPadOS and macOS are just two halves of the same coin? Should there be a middle ground, and a new type of product similar to the Surface Book, but running macOS? Should Macs and iPads run a completely uniform OS?
Will iPadOS ever expose the same sort of UNIX-based tools for IT administrators and software developers that macOS does? These are all very real questions that will pop up in the near future. The line between Stage 3 and Stage 4 will be blurry, and will depend on how Apple wishes to address different problems going forward, and what the reactions look like. It is very possible that only the 12” model will be released at first, or that a handful of additional lower-end laptop and desktop products could follow, with high-performance Macs coming in Stage 4; or perhaps everything but enterprise products like the Mac Pro will be switched over fully. Only time will tell.
Stage 4 (the end goal):
Congratulations, you’ve made it to the end of my TED talk. We are now well into the 2020s and COVID-19 Part 4 is casually catching up to the 5G = Virus crowd. All Macs have transitioned fully to ARM: iMac, MacBooks Pro and otherwise, Mac Pro, Mac Mini, everything. The future is fully Apple from top to bottom, and vertical integration leading to market dominance continues. Many other OEMs have begun to follow this path to some extent, creating more demand for a similar class of silicon from other firms. The remainder here is pure speculation with a dash of wishful thinking. There are still a lot of things that are entirely unclear. The only concrete thing is that Stage 4 will happen when everything is running Apple’s in-house processors. By this point, consumers will be quite familiar with ARM Macs, and developers will have had enough time to transition their apps fully over to the newly unified system. Any performance, battery life, or app support concerns will not be an issue at this point. There are no more details here; it’s the end of the road, but we are left with a number of questions. It is unclear if Apple will stick with AMD GPUs, or whether it will instead opt for the in-house graphics solutions it has used since the A11 series of processors. How Thunderbolt support on these models of Mac will be achieved is unknown. While Intel has made it openly available for use, and there are plans to combine USB and Thunderbolt into a single standard, it’s still unclear how it will play along with Apple processors. Presently, iPhones do support connecting devices to the processor via PCI Express, but it has only been used for iPhone and iPad storage. The current Apple processors simply lack the number of lanes required for even the lowest-end MacBook Pro. This is an issue that would need to be addressed in order to ship a full desktop-grade platform.
There is also the question of upgradability for desktop models, and if and how there will be a replaceable, socketed version of these processors. Will standard desktop and laptop memory modules play nicely with these ARM processors? Will they drop standard memory across the board, in favor of soldered options, or continue to support user-configurable memory on some models? Will my 2023 Mac Pro play nicely with a standard PCI Express device that I buy off the shelf? Will we see a return of “Mac Edition” PCI devices? There are still a lot of unknowns, and guessing any further in advance is too difficult. The only thing that is certain, however, is that Apple processors coming to Mac is very much within arm’s reach.
Since 1983, I have lived, worked and raised a family in a progressive, egalitarian, income-sharing intentional community (or commune) of 100 people in rural Virginia. AMA.
Hello Reddit! My name is Keenan Dakota. I have lived at Twin Oaks, an income-sharing, intentional community in rural Virginia, for 36 years, since 1983. I grew up in northern Virginia; my parents worked in government. I went to George Mason University, where I studied business management. I joined Twin Oaks when I was 23 because I had lost faith in the underpinnings of capitalism and was looking for a better model. I have stayed because, over time, capitalism hasn't looked any better, and it's a great place to raise children. While at Twin Oaks, I raised two boys to adulthood, constructed several buildings, managed the building maintenance program, and have managed some of the business lines at different times. Proof this is me. A younger photo of me at Twin Oaks. Here is a video interview of me about living at Twin Oaks. Photo of Twin Oaks members at the 50th anniversary. Some things that make life here different from the mainstream:
The labor system - all work is considered equal, whether you are earning income for the community or not. Cooking/cleaning counts the same as planning the annual budget. Also, you don't have to do the same job all week - your day can be a mix of indoor and outdoor work, you have freedom to arrange your day, and you can gain skills in a wide array of tasks and trades.
Non-gender binary, queer and trans people are very welcome at Twin Oaks. People introduce themselves with their pronouns and a significant number of our members go by they/them.
Verbal consent culture is very important here. It is not okay to touch anyone without asking.
Nudity and partial nudity is allowed in some parts of the farm, such as in the sauna, swimming hole, on the hiking trails, etc.
Our social norms prohibit using phones in common areas when other members are present, with the exception of a few cafe-style spaces.
Every day we provide a home-cooked, plant-based lunch and dinner with options for special diets including vegetarian, vegan, gluten-free, and no onions & garlic.
Raising kids here is easier. Some of the time that parents spend raising their children counts towards their labor quota. Many of the kids are home-schooled or "unschooled", and they spend more time outside than in front of a screen. The kids have no problem passing the state's annual standardized test to move onto the next grade level.
We have a shared clothing resource called Commie Clothes, which is like a free thrift store. Borrow something and then return it dirty, and it gets washed and re-hung up.
More about Twin Oaks: Twin Oaks is an intentional community in rural central Virginia, made up of around 90 adult members and 15 children. Since the community's beginning in 1967, our way of life has reflected our values of cooperation, sharing, nonviolence, equality, and ecology. We do not have a group religion; our beliefs are diverse. We do not have a central leader; we govern ourselves by a form of democracy with responsibility shared among various managers, planners, and committees. We are self-supporting economically, and partly self-sufficient. We are income-sharing. Each member works 42 hours a week in the community's business and domestic areas. Each member receives housing, food, healthcare, and personal spending money from the community. We have open slots and are accepting applications for new members. All prospective new members must participate in a three-week visitor program. Applicants to join must leave for 30 days after their visit while the community decides on their application. We offer a $5 tour of the property on Saturdays, starting in March. More info here. Ask me anything! TL;DR: Opted out of the rat race and retired at 23 to live in the woods with a bunch of hippies. EDIT: Thanks for all the questions! If you want some photos of the farm, you can check out our instagram. EDIT2: I'm answering new, original questions again today. Sort by new and scroll through the trolls to see more of my responses. EDIT3: We DO have food with onion & garlic! At meals, there is the regular food, PLUS alternative options for vegan/vegetarian/no gluten/no onions & garlic. EDIT4: Some of you have been asking if we are a cult. No, we are not. We don't have a central leader or common religion. Here are characteristics of cults, FYI. Edit: Yikes! Did I mention that I am 60? Reddit is not my native land. I don't understand the hostile, angry and seemingly deliberately obtuse comments on here. And Soooo many people!
Anyway, to the angry crowd: Twin Oaks poses no threat to anyone; we are 100 people out of a country of 330 million. Twin Oaks reached its current maximum population about 25 years ago, so we are not growing fast, or at all. Members come and go from Twin Oaks. There are, my guess is, 800 ex-members of Twin Oaks, so we aren't holding on to everyone who joins—certainly, no one is held against their will. Twin Oaks is in rural Virginia, but we really aren't insular, isolated, gated or scared of the mainstream culture. We have scheduled tours of the whole property. Local government officials, like building inspectors, come to Twin Oaks with some frequency. People at Twin Oaks like to travel and manage to do so. I personally know lots of people in the area; I am also a runner, so I leave the property probably every day. There are lots of news stories about Twin Oaks over the years. If you are worried about Twin Oaks, maybe you could go read what the mainstream (and alternative) media have to say. Except about equality, Twin Oaks is not particularly dogmatic about anything. (I know some people at Twin Oaks will disagree with that statement.) Twin Oaks isn't really hypocritical about Capitalism, Socialism, or Communism; we just don't identify those concepts as something that we are trying to do. Twin Oaks is not trying to DO Communism; we are trying to live a good life with equally empowered citizens—which has led us to try to maintain economic parity among members. Communists also do that. In making decisions in the community, I don't remember anyone trying to support or oppose an idea due to excess or insufficient Communism, Socialism, or Capitalism. In most practical senses those words aren't useful and don't mean anything. So, no need to hammer Twin Oaks for being insufficiently pure, or hypocritical. Twin Oaks is very similar to the kibbutzim in Israel.
If anyone has concerns or questions about what would happen if places like Twin Oaks suddenly became much larger and more common, read about the history of the kibbutz movement, which may have grown to possibly 1% of the population at its largest. There was and is no fight with Capitalism from the kibbutz—or with the State. My point is—not a threat. To the other people who think that the ideas of Twin Oaks are interesting, I want you to know it is possible to live at Twin Oaks (or places like Twin Oaks) and happily live one's entire life. There is no central, critical failing that makes the idea not work. And plenty of upside. But do lots of research first. Twin Oaks maintains a massive website. (Anyway, it takes a long time to read.) But what I would like to see is more people starting more egalitarian, income-sharing communities. I think that there is a need for a community that is designed and built by families, who also share income and provide mutual support with labor and money. If you love this concept, maybe consider gathering together other people and starting your own. Ideologically speaking:
-Ecology: the best response to ecological problems is for humans to use fewer resources. The easiest way to use fewer resources is to share resources. Living communally vastly cuts down on resource use without reducing quality of life.
-Equality: ideologically speaking, most people accept the idea that all humans have equal rights, but most social structures operate in ways that are fundamentally unequal. If we truly believe in equality, then we ought to be willing to put our bodies where our ideology is. In a truly equal world, the issues of sexism and racism and all other forms of discrimination would, essentially, not exist.
-Democracy: Twin Oaks uses all manner of decision-making models and tools to try to include everyone and to keep people equally empowered. There is no useful word for this. We do use a majority vote sometimes, as a fallback.
But sometimes we use consensus. We sometimes use sociocracy (dynamic governance). The word "Isocracy" (decision-making among equals) would be useful to describe Twin Oaks' decision-making model, but Lev in Australia has written an incomprehensible "definition" on Wikipedia that he keeps changing back when someone corrects it.
-Happiness: The overarching goal of all ideologies is to make people happy, right? I mean, isn't it? Capitalism is based upon the belief that motivation is crucial to human aspiration and success (and therefore more happiness). Under Capitalism, equality is a detriment because it hinders motivation (less fear of failure, or striving for success). Twin Oaks believes that humans are happier when they are equal, and equally empowered. So the place to start up the ladder of happiness is to first make everyone equal. Well, Twin Oaks is mainly still working on that first step. EDIT5: Some have asked about videos - here are links to documentaries about Twin Oaks by BBC, VICE and RT.
Hi everyone, this is my first ever post here. I run a little website called The Thought Experiment where I talk about various issues, some of them Singapore-related. One of my main interests is Singaporean politics. With the GE2020 election results in, I thought I should pen down my take on what we as the electorate were trying to say. If you like what I wrote, I also wrote another article on the state of play for GE2020 during the campaigning period, as well as two other articles related to GE2015 back when it was taking place. If you don't like what I wrote, that's ok! I think the beauty of freedom of expression is that everyone is entitled to their opinion. I'm always happy to get feedback, because I do think that more public discourse about our local politics helps us to be more politically aware as a whole. Just thought I'd share my article here to see what you guys make of it :D Article Starts Here: During the campaigning period, both sides sought to portray an extreme scenario of what would happen if voters did not vote for them. The People's Action Party (PAP) warned Singaporeans that their political opponents “might eventually replace the government after July 10”. Meanwhile, the Workers' Party (WP) stated that “there was a real risk of a wipeout of elected opposition MPs at the July 10 polls”. Today is July 11th. As we all know, neither of these scenarios came to pass. The PAP comfortably retained its supermajority in Parliament, winning 83 out of 93 elected MP seats. But just as in GE2011, another Group Representation Constituency (GRC) has fallen to the WP. In addition, the PAP saw its vote share drop drastically, down almost 9% to 61.2% from 69.9% in GE2015. Singapore's electorate is unique in that a significant proportion is comprised of swing voters: voters who don't hold blind allegiance to any political party, but vote based on a variety of factors, both micro and macro.
The above extreme scenarios were clearly targeted at these swing voters. Well, the swing voters have made their choice, their roar sending 4 more elected opposition MPs into Parliament. This article aims to unpack that roar and what it means for the state of Singaporean politics going forward. 1. The PAP is still the preferred party to form Singapore's Government. Yes, this may come across as blindingly obvious, but it still needs to be said. The swing voter is, by definition, liable to changes of opinion. And a large factor that determines how a swing voter votes is their perception of how their fellow swing voters are voting. If swing voters perceive that most swing voters are leaning towards voting for the opposition, they might feel compelled to vote for the incumbent. And if the reverse is true, swing voters might feel the need to shore up opposition support. Why is this so? It is because the swing voter is trying to push the vote result into a sweet spot – one that lies between the two extreme scenarios espoused by either side. They don't want the PAP to sweep all 93 seats in a ‘white tsunami’. Neither do they want the opposition to claim so much territory that the PAP is too weak to form the Government on its own. But because each swing voter has only a binary choice – either they vote for one side or the other (I'm ignoring the third option where they simply spoil their vote) – they can't very well say “I want to vote 0.6 for the PAP and 0.4 for the Opposition with my vote”. And so we can expect the swing voter bloc to continue being a source of uncertainty for both sides in future elections, as long as swing voters are still convinced that the PAP should be the Government. 2. Voters no longer believe that the PAP needs a ‘strong mandate’ to govern. They also don't buy into the NCMP scheme. Throughout the campaign period, the PAP repeatedly exhorted voters to vote for them alone.
Granted, they couldn’t very well give any ground to the opposition without a fight. And therefore there was an attempt to equate voting for the PAP as voting for Singapore’s best interests. However, the main message that voters got was this: PAP will only be able to steer Singapore out of the Covid-19 pandemic if it has a strong mandate from the people. What is a strong mandate, you may ask? While no PAP candidate publicly confirmed it, their incessant harping on the Non-Constituency Member of Parliament (NCMP) scheme as the PAP’s win-win solution for having the PAP in power and a largely de-fanged opposition presence in parliament shows that the PAP truly wanted a parliament where it held every single seat. Clearly, the electorate has different ideas, handing Sengkang GRC to the WP and slashing the PAP’s margins in previous strongholds such as West Coast, Choa Chu Kang and Tanjong Pagar by double digit percentages. There is no doubt from the results that swing voters are convinced that a PAP supermajority is not good for Singapore. They are no longer convinced that to vote for the opposition is a vote against Singapore. They have realized, as members of a maturing democracy surely must, that one can vote for the opposition, yet still be pro-Singapore. 3. Social Media and the Internet are rewriting the electorate’s perception. In the past, there was no way to have an easily accessible record of historical events. With the only information source available being biased mainstream media, Singaporeans could only rely on that to fill in the gaps in their memories. Therefore, Operation Coldstore became a myth of the past, and Chee Soon Juan became a crackpot in the eyes of the people, someone who should never be allowed into Parliament. Fast forward to today. Chee won 45.2% of the votes in Bukit Batok’s Single Member Constituency (SMC). His party-mate, Dr. Paul Tambyah did even better, winning 46.26% of the votes in Bukit Panjang SMC. 
For someone previously seen as unfit for public office, this is an extremely good result. Chee has been running for elections in Singapore for a long time, and only now is there a significant change in the way he is perceived (and supported) by the electorate. Why? Because of social media and the internet, two things which the PAP does not have absolute control over. With the ability to conduct interviews with social media personalities as well as upload party videos on Youtube, he has been able to display a side of himself that the PAP did not want people to see: someone who is merely human just like them, but who is standing up for what he believes in. 4. Reserved Election Shenanigans and Tan Cheng Bock: The electorate has not forgotten. Tan Cheng Bock almost became our President in 2011. There are many who say that if Tan Kin Lian and Tan Jee Say had not run, Tony Tan would not have been elected. In March 2016, Tan Cheng Bock publicly declared his interest in running for the next Presidential Election, which would be held in 2017. The close result of 2011 and Tan Cheng Bock's imminent candidacy made the upcoming Presidential Election one that was eagerly anticipated. That is, until the PAP shut down his bid for the presidency just a few months later in September 2016, using its supermajority in Parliament to pass a “reserved election” in which only members of a particular race could take part. Under the new rules that they had drawn up for themselves, it was decreed that only Malays could take part. And not just any Malay. The candidate had to either be a senior executive managing a firm with S$500 million in shareholders' equity, or hold the post of Speaker of Parliament or a similarly high post in the public sector (the exact criteria are a bit more in-depth than this, but this is the gist of it. You can find the full criteria here). And who was the Speaker of Parliament at the time?
Mdm Halimah, who was conveniently of the right race (Although there was some hooha about her actually being Indian). With the extremely strict private sector criteria and the PAP being able to effectively control who the public sector candidate was, it came as no surprise that Mdm Halimah was declared the only eligible candidate on Nomination Day. A day later, she was Singapore’s President. And all without a single vote cast by any Singaporean. Of course, the PAP denied that this was a move specifically aimed at blocking Tan Cheng Bock’s bid for the presidency. Chan Chun Sing, Singapore’s current Minister of Trade and Industry, stated in 2017 that the Government was prepared to pay the political price over making these changes to the Constitution. We can clearly see from the GE2020 results that a price was indeed paid. A loss of almost 9% of vote share is very significant, although a combination of the first-past-the-post rule and the GRC system ensured that the PAP still won 89.2% of the seats in Parliament despite only garnering 61.2% of the votes. On the whole, it’s naught but a scratch to the PAP’s overwhelming dominance in Parliament. The PAP still retains its supermajority and can make changes to the Constitution anytime that it likes. But the swing voters have sent a clear signal that they have not been persuaded by the PAP’s rationale. 5. Swing Voters do not want Racial Politics. In 2019, Heng Swee Keat, Singapore’s Deputy Prime Minister and the man who is next in line to be Prime Minister (PM) commented that Singapore was not ready to have a non-Chinese PM. He further added that race is an issue that always arises at election-time in Singapore. Let us now consider the GE2015 results. Tharman Shanmugaratnam, Singapore’s Senior Minister and someone whom many have expressed keenness to be Singapore’s next PM, obtained 79.28% of the vote share in Jurong GRC. This was above even the current Prime Minister Lee Hsien Loong, who scored 78.63% in Ang Mo Kio GRC. 
Tharman’s score was the highest in the entire election. And now let us consider the GE2020 results. Tharman scored 74.62% in Jurong, again the highest score of the entire election, while Hsien Loong scored 71.91%. So Tharman beat the current PM again, and by an even bigger margin than the last time. Furthermore, Swee Keat, who made the infamous comments above, scored just 53.41% in East Coast. Yes, I know I'm ignoring a lot of other factors that influenced these results. But don't these results show conclusively that Heng's comments were wrong? We have an Indian leading both the current and future PM in both elections, and yet the PAP still feels the need to say that Singapore “hasn't arrived” at a stage where we can vote without race in mind. In fact, this was the same rationale that supposedly led to the reserved presidency, as mentioned in my earlier point. The swing voters have spoken, and it is exceedingly clear to me that the electorate does not care about the race of our highest office-holders, whether it be the PM or the President. Our Singapore pledge firmly states “regardless of race”, and I think the results have shown that we as a people have taken it to heart. But has the PAP? 6. Voters will not be so easily manipulated. On one hand, Singaporeans were exhorted to stay home during the Covid-19 pandemic. Contact tracing became mandatory, and groups of more than 5 were prohibited. But on the other hand, we were also told that it was absolutely necessary to hold an election during this same period, for Singaporeans to wait in long lines and in close proximity to each other as we congregated to cast our votes, all because the PAP needed a strong mandate. On one hand, Heng Swee Keat lambasted the Workers' Party, claiming that it was “playing games with voters” over their refusal to confirm if they would accept NCMP seats. But on the other hand, Heng Swee Keat was moved to East Coast GRC at the eleventh hour in a surprise move to secure the constituency.
(As mentioned above, he was aptly rewarded for this with a razor-thin margin of just 53.41% of the votes.) On one hand, Masagos Zulkifli, PAP Vice-Chairman stated that “candidates should not be defined by a single moment in time or in their career, but judged instead by their growth throughout their life”. He said this in defense of Ivan Lim, who appears to be the very first candidate in Singaporean politics to have been pushed into retracting his candidacy by the power of non-mainstream media. But on the other hand, the PAP called on the WP to make clear its stand on Raeesah Khan, a WP candidate who ran (and won) in Sengkang GRC for this election, stating that the Police investigation into Raeesah’s comments made on social media was “a serious matter which goes to the fundamental principles on which our country has been built”. On one hand, Chan Chun Sing stated in 2015, referring to SingFirst’s policies about giving allowances to the young and the elderly, “Some of them promised you $300 per month. I say, please don’t insult my residents. You think…. they are here to be bribed?” On the other hand, the PAP Government has just given out several handouts under its many budgets to help Singaporeans cope with the Covid-19 situation. [To be clear, I totally approve of these handouts. What I don’t approve is that the PAP felt the need to lambast similar policies as bribery in the past. Comparing a policy with a crime is a political low blow in my book.] I could go on, but I think I’ve made my point. And so did the electorate in this election, putting their vote where it counted to show their disdain for the heavy-handedness and double standards that the PAP has displayed for this election. Conclusion I don’t say the above to put down the PAP. The PAP would have you believe that to not support them is equivalent to not wanting what’s best for Singapore. 
This is a false dichotomy that must be stamped out, and I am glad to see our swing voters taking a real stand in this election. No, I say the above as a harsh but ultimately supportive letter to the PAP. As everyone can see from the results, we all still firmly believe that the PAP should be the Government. We still have faith that the PAP has the leadership to take us forward and out of the Covid-19 crisis. But we also want to send the PAP a strong signal with this vote, to bring them down from their ivory towers and back to the ground. Enough with the double standards. Enough with the heavy-handedness. Singaporeans have clearly stated their desire for a more mature democracy, and that means more alternative voices in Parliament. The PAP needs to stop acting as the father who knows it all, and start acting as the bigger brother who can work hand in hand with his younger brother towards what's best for the entire family: Singapore. There is a real chance that the PAP will not listen, though. As Lee Hsien Loong admitted at a rally in 2006, “if there are 10, 20… opposition members in Parliament… I have to spend my time thinking what is the right way to fix them”. Now, the PAP has POFMA at its disposal. It still has the supermajority in Parliament, making it able to change any law in Singapore, even the Constitution, at will. We have already seen them put these tools to use for their own benefit. Let us see if the PAP will continue as it always has, or take this opportunity to change itself for the better. Whatever the case, we will be watching, and we will be waiting to make our roar heard once again five years down the road. Majulah Singapura! Article Ends Here.
Here's the link to the actual article: https://thethoughtexperiment.org/2020/07/11/ge2020-the-roar-of-the-swing-vote And here's the link to the other political articles I've written about Singapore: https://thethoughtexperiment.org/2020/07/07/ge2020-the-state-of-play/ https://thethoughtexperiment.org/2015/09/10/ge2015-voting-wisely/ https://thethoughtexperiment.org/2015/09/05/expectations-of-the-opposition/
Wall Street Week Ahead for the trading week beginning June 29th, 2020
Good Saturday afternoon to all of you here on StockMarket. I hope everyone on this sub made out pretty nicely in the market this past week, and is ready for the new trading week ahead. Here is everything you need to know to get you ready for the trading week beginning June 29th, 2020.
Fragile economic recovery faces first big test with June jobs report in the week ahead - (Source)
The second half of 2020 is nearly here, and now it’s up to the economy to prove that the stock market was right about a sharp comeback in growth. The first big test will be the June jobs report, out on Thursday instead of its usual Friday release due to the July 4 holiday. According to Refinitiv, economists expect 3 million jobs were created, after May’s surprise gain of 2.5 million payrolls beat forecasts by a whopping 10 million jobs. “If it’s stronger, it will suggest that the improvement is quicker, and that’s kind of what we saw in May with better retail sales, confidence was coming back a little and auto sales were better,” said Kevin Cummins, chief U.S. economist at NatWest Markets. The second quarter winds down in the week ahead as investors are hopeful about the recovery but warily eyeing rising cases of Covid-19 in a number of states. Stocks were lower for the week, as markets reacted to rising cases in Texas, Florida and other states. Investors worry about the threat to the economic rebound as those states move to curb some activities. The S&P 500 is up more than 16% so far for the second quarter, and it is down nearly 7% for the year. Friday’s losses wiped out the last of the index’s June gains. “I think the stock market is looking beyond the valley. It is expecting a V-shaped economic recovery and a solid 2021 earnings picture,” said Sam Stovall, chief investment strategist at CFRA. He expects large-cap company earnings to be up 30% next year, and small-cap profits to bounce back by 140%. “I think the second half needs to be a ‘show me’ period, proving that our optimism was justified, and we’ll need to see continued improvement in the economic data, and I think we need to see upward revisions to earnings estimates,” Stovall said. Liz Ann Sonders, chief investment strategist at Charles Schwab, said she expects the recovery will not be as smooth as some expect, particularly considering the resurgence of virus outbreaks in sunbelt states and California. 
“Now as I watch what’s happening I think it’s more likely to be rolling Ws,” rather than a V, she said. “It’s not just predicated on a second wave. I’m not sure we ever exited the first wave.” Even without actual state shutdowns, the virus could slow economic activity. “That doesn’t mean businesses won’t shut themselves down, or consumers won’t back down more,” she said.
In the second half of the year, the market should turn its attention to the election, but Sonders does not expect much reaction to it until after Labor Day. The RealClearPolitics average of polls shows Democrat Joe Biden leading President Donald Trump by 10 percentage points, and the odds of a Democratic sweep have been rising. Biden has said he would raise corporate taxes, and some strategists say a sweep would be bad for business, due to increased regulation and higher taxes. Trump is expected to continue using tariffs, which unsettles the market, though both candidates are expected to take a tough stance on China. “If it looks like the Senate stays Republican then there’s less to worry about in terms of policy changes,” Sonders said. “I don’t think it’s ever as binary as some people think.” Stovall said a quick study shows that in the four presidential election years back to 1960 where the first quarter was negative and the second quarter positive, stocks made gains in the second half. Those were 1960, when John Kennedy took office; 1968, when Richard Nixon won; 1980, when Ronald Reagan was elected to his first term; and 1992, the first win by Bill Clinton. Coincidentally, in all of those years, the opposing party gained control of the White House.
The stock market’s strong second-quarter showing came after the Fed and Congress moved quickly to inject the economy with trillions in stimulus. That unlocked credit markets and triggered a stampede by companies to restructure or issue debt. About $2 trillion in fiscal spending was aimed at consumers and businesses, who were in sudden need of cash after the abrupt shutdown of the economy. Fed Chairman Jerome Powell and Treasury Secretary Steven Mnuchin both testify before the House Financial Services Committee Tuesday on the response to the virus. That will be important as markets look ahead to another fiscal package from Congress this summer, which is expected to provide aid to state and local governments, extend some enhanced unemployment benefits, and provide more support for businesses. “So much of it is still so fluid. There are a bunch of fiscal items that are rolling off. There’s talk about another fiscal stimulus payment like they did last time with a $1,200 check,” said Cummins. Strategists expect Congress to bicker about the size and content of the stimulus package but ultimately come to an agreement before enhanced unemployment benefits run out at the end of July. Cummins said state budgets begin a new year July 1, and states with a critical need for funds may have to start letting workers go as they cut expenses. The Trump administration has indicated the jobs report Thursday could help shape the fiscal package, depending on what it shows. The federal supplement to state unemployment benefits has been $600 a week, but there is opposition to extending that, and strategists expect it to be at least cut in half. The unemployment rate is expected to fall to 12.2% from 13.3% in May. Cummins said he had expected 7.2 million jobs, well above the consensus, and an unemployment rate of 11.8%. As of last week, nearly 20 million people were collecting state unemployment benefits, and millions more were collecting under a federal pandemic aid program.
“The magnitude here and whether it’s 3 million or 7 million is kind of hard to handicap to begin with,” Cummins said. Economists have preferred to look at unemployment claims as a better real-time read of employment, but they now say those numbers could be impacted by slow reporting or double filing. “There’s no clarity on how you define the unemployed in the Covid-19 environment,” said Chris Rupkey, chief financial economist at MUFG Union Bank. “If there’s 30 million people receiving insurance, unemployment should be above 20%.”
This past week saw the following moves in the S&P:
The economy is moving in the right direction, as many economic data points are coming in substantially better than what economists expected. From May job gains coming in more than 10 million higher than expected to retail sales soaring a record 18%, how quickly the economy is bouncing back has surprised nearly everyone. “As good as the recent economic data has been, we want to make it clear, it could still take years for the economy to fully come back,” explained LPL Financial Senior Market Strategist Ryan Detrick. “Think of it like building a house. You get all the big stuff done early, then some of the small things take so much longer to finish; I’m looking at you, crown molding.” Here’s the hard truth: it might take years for all of the jobs that were lost to fully recover. In fact, during the 10 recessions since 1950, it took an average of 30 months for lost jobs to finally come back. As the LPL Chart of the Day shows, recoveries have taken much longer lately. It took four years for the jobs lost during the tech bubble recession of the early 2000s to come back, and more than six years for all the jobs lost in the Great Recession to return. Given many more jobs were lost during this recession, it could take many years before all of them indeed come back.
The economy is headed in the right direction, and if there is no major second-wave outbreak it could surprise to the upside. Importantly, though, this economic recovery will still be a long and bumpy road.
Nasdaq - Russell Spread Pulling the Rubber Band Tight
The Nasdaq has been outperforming every other US-based equity index over the last year, and nowhere has the disparity been wider than with small caps. The chart below compares the performance of the Nasdaq and Russell 2000 over the last 12 months. While the performance disparity is wide now, through last summer, the two indices were tracking each other nearly step for step. Then last fall, the Nasdaq started to steadily pull ahead before really separating itself in the bounce off the March lows. Just to illustrate how wide the gap between the two indices has become, over the last six months, the Nasdaq is up 11.9% compared to a decline of 15.8% for the Russell 2000. That's wide!
In order to put the recent performance disparity between the two indices into perspective, the chart below shows the rolling six-month performance spread between the two indices going back to 1980. With a current spread of 27.7 percentage points, the gap between the two indices hasn't been this wide since the days of the dot-com boom. Back in February 2000, the spread between the two indices widened out to more than 50 percentage points. Not only was that period extreme, but ten months before that extreme reading, the spread also widened out to more than 51 percentage points. The current spread is wide, but with two separate periods in 1999 and 2000 where the performance gap between the two indices was nearly double the current level, that was a period where the Nasdaq REALLY outperformed small caps.
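For readers who want to reproduce this kind of chart, the rolling six-month spread is straightforward to compute from two price series. A minimal sketch with pandas, using synthetic random-walk prices as stand-ins for actual Nasdaq and Russell 2000 closes:

```python
import numpy as np
import pandas as pd

# Synthetic daily closes for two indices (placeholders, not real index data).
rng = np.random.default_rng(0)
dates = pd.bdate_range("2019-01-01", periods=400)
nasdaq = pd.Series(100 * np.cumprod(1 + rng.normal(0.0008, 0.015, len(dates))), index=dates)
russell = pd.Series(100 * np.cumprod(1 + rng.normal(0.0002, 0.015, len(dates))), index=dates)

WINDOW = 126  # roughly six months of trading days

# Rolling six-month total return for each index, in percent.
nasdaq_6m = nasdaq.pct_change(WINDOW) * 100
russell_6m = russell.pct_change(WINDOW) * 100

# Spread in percentage points: positive means the Nasdaq is outperforming.
spread = nasdaq_6m - russell_6m
print(round(spread.dropna().iloc[-1], 1))
```

With real closing prices loaded into the two series, `spread` reproduces the rolling performance-gap line described above.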
To illustrate the magnitude of the Nasdaq's outperformance over the Russell 2000 from late 1998 through early 2000, the chart below shows the performance of the two indices beginning in October 1998. From that point right on through March of 2000 when the Nasdaq peaked, the Nasdaq rallied more than 200% compared to the Russell 2000 which was up a relatively meager 64%. In any other environment, a 64% gain in less than a year and a half would be excellent, but when it was under the shadow of the surging Nasdaq, it seemed like a pittance.
The US equity market made its most recent peak on June 8th. From the March 23rd low through June 8th, the average stock in the large-cap Russell 1,000 was up more than 65%! Since June 8th, the average stock in the index is down more than 11%. Below we have broken the index into deciles (10 groups of 100 stocks each) based on simple share price as of June 8th. Decile 1 (marked "Highest" in the chart) contains the 10% of stocks with the highest share prices. Decile 10 (marked "Lowest" in the chart) contains the 10% of stocks with the lowest share prices. As shown, the highest priced decile of stocks are down an average of just 4.8% since June 8th, while the lowest priced decile of stocks are down an average of 21.5%. It's pretty remarkable how performance gets weaker and weaker the lower the share price gets.
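The decile breakdown described above can be reproduced by ranking stocks on share price and bucketing with `pd.qcut`. A sketch on a made-up universe (the price and return columns are random placeholders, not actual Russell 1,000 data):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)

# Hypothetical universe: share price on June 8 and percent return since June 8.
df = pd.DataFrame({
    "price_jun8": rng.uniform(2, 500, 1000),
    "ret_since_jun8": rng.normal(-10, 8, 1000),
})

# Decile 1 = highest-priced 10%, decile 10 = lowest-priced, matching the post's labels.
df["decile"] = pd.qcut(df["price_jun8"].rank(method="first", ascending=False),
                       10, labels=range(1, 11))

# Average return since June 8 within each share-price decile.
avg_by_decile = df.groupby("decile", observed=True)["ret_since_jun8"].mean()
print(avg_by_decile.round(1))
```

Ranking before `qcut` (with `method="first"`) guarantees ten equal buckets even when prices tie.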
It's hard to believe that sentiment can change so fast in the market that one day investors and traders are bidding up stocks to record highs, but then the next day sell them so much that it takes the market down over 2%. That's exactly what happened not only in the last two days but also two weeks ago. While the 5% pullback from a record high back on June 10th took the Nasdaq back below its February high, this time around, the Nasdaq has been able to hold above those February highs.
In the entire history of the Nasdaq, there have only been 12 periods prior to this week where the Nasdaq closed at an all-time high on one day but dropped more than 2% the next day. Those occurrences are highlighted in the table below along with the index's performance over the following week, month, three months, six months, and one year. We have also highlighted each occurrence that followed a prior one by less than three months in gray. What immediately stands out in the table is how much gray shading there is. In other words, these types of events tend to happen in bunches, and if you count the original occurrence in each of the bunches, the only two occurrences that didn't come within three months of another occurrence (either before or after) were July 1986 and May 2017. In terms of market performance following prior occurrences, the Nasdaq's average and median returns were generally below average, but there is a pretty big caveat. While the average one-year performance was a gain of 1.0% and a decline of 23.6% on a median basis, the six occurrences that came between December 1999 and March 2000 all essentially cover the same period (which was very bad) and skew the results. Likewise, the three occurrences in the two-month stretch from late November 1998 through January 1999 where the Nasdaq saw strong gains also involves a degree of double-counting. As a result of these performances at either end of the extreme, it's hard to draw any trends from the prior occurrences except to say that they are typically followed by big moves in either direction. The only time the Nasdaq wasn't either 20% higher or lower one year later was in 1986.
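Screening for these occurrences, an all-time closing high followed by a drop of more than 2% the next day, takes only a few lines. A sketch on a toy price series (real Nasdaq closes would be substituted in practice):

```python
import pandas as pd

# Toy closing-price series; in practice, load actual Nasdaq closes here.
close = pd.Series([100, 102, 99, 103, 100.5, 104, 101.5, 105])

at_high = close == close.cummax()          # closed at an all-time high
next_ret = close.pct_change().shift(-1)    # the following day's return
signal = at_high & (next_ret < -0.02)      # high today, >2% drop tomorrow

print(close.index[signal].tolist())  # → [1, 3, 5]
```

Counting `signal` occurrences on the full index history, and measuring forward returns from each flagged date, yields the table the post describes.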
In the mid-1980s the market began to evolve into a tech-driven market, and the market’s focus in early summer shifted to the outlook for second quarter earnings of technology companies. Over the last three trading days of June and the first nine trading days in July, NASDAQ typically enjoys a rally. This 12-day run has been up 27 of the past 35 years with an average historical gain of 2.5%. This year the rally may have begun a day early (today) and could last until on or around July 14. After the bursting of the tech bubble in 2000, NASDAQ’s mid-year rally had a spotty track record from 2002 until 2009, with three appearances and five no-shows in those years. However, it has been quite solid over the last ten years, up nine times with a single mild 0.1% loss in 2015. Last year, NASDAQ advanced a solid 4.6% during the 12-day span.
Tech Historically Leads Market Higher Until Q3 of Election Years
As of yesterday’s close, DJIA was down 8.8% year-to-date, S&P 500 was down 3.5% and NASDAQ was up 12.1%. Compared to the typical election year, DJIA and S&P 500 are below historical average performance while NASDAQ is above average. However, this year has not been a typical election year. Due to Covid-19, the market suffered the shortest bear market on record and began a new bull market, all before the first half of the year came to an end. In the accompanying Seasonal Pattern Charts of DJIA, S&P 500 and NASDAQ, we compare 2020 (as of yesterday’s close) to All Years and Election Years. This year’s performance has been plotted on the right vertical axis in each chart. This year certainly has been unlike any other; however, some notable observations can be made. For DJIA and S&P 500, January, February and approximately half of March have historically been weak, on average, in election years. This year the bear market ended on March 23. Following those past weak starts, DJIA and S&P 500 historically enjoyed strength lasting into September before experiencing any significant pullback, followed by a nice year-end rally. NASDAQ’s election year pattern differs somewhat with six fewer years of data, but it does hint at a possible late Q3 peak.
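Seasonal pattern charts like these are built by averaging each year's cumulative year-to-date path across a chosen set of years. A rough sketch with synthetic daily returns (the years, the election-year set, and all return figures are placeholders):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)

# Synthetic daily return paths for a handful of years; flag the election years.
years = range(2008, 2021)
election_years = {2008, 2012, 2016, 2020}
frames = []
for y in years:
    days = 252  # trading days per year
    path = np.cumprod(1 + rng.normal(0.0003, 0.01, days)) - 1
    frames.append(pd.DataFrame({"year": y, "day": np.arange(1, days + 1),
                                "ytd": path * 100}))
data = pd.concat(frames, ignore_index=True)

# Average year-to-date path by trading day: all years vs. election years only.
all_years_avg = data.groupby("day")["ytd"].mean()
election_avg = data[data["year"].isin(election_years)].groupby("day")["ytd"].mean()
print(len(all_years_avg), len(election_avg))
```

Plotting `all_years_avg`, `election_avg`, and the current year's path on a shared day axis gives a chart of the kind described above.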
Friday 7.3.20 Before Market Open:
Friday 7.3.20 After Market Close:
Micron Technology, Inc. $48.49
Micron Technology, Inc. (MU) is confirmed to report earnings at approximately 4:00 PM ET on Monday, June 29, 2020. The consensus earnings estimate is $0.71 per share on revenue of $5.27 billion and the Earnings Whisper ® number is $0.70 per share. Investor sentiment going into the company's earnings release has 71% expecting an earnings beat. The company's guidance was for earnings of $0.40 to $0.70 per share. Consensus estimates are for earnings to decline year-over-year by 29.00% with revenue increasing by 10.07%. Short interest has increased by 7.6% since the company's last earnings release while the stock has drifted higher by 8.0% from its open following the earnings release to be 0.9% below its 200 day moving average of $48.94. Overall earnings estimates have been revised lower since the company's last earnings release. On Thursday, June 11, 2020 there was some notable buying of 46,037 contracts of the $60.00 call expiring on Friday, July 17, 2020. Option traders are pricing in a 4.6% move on earnings and the stock has averaged an 8.4% move in recent quarters.
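The "pricing in a 4.6% move" figure is typically derived from the at-the-money straddle expiring just after the report: the combined cost of the ATM call and put as a fraction of the share price. A back-of-the-envelope sketch (the option prices below are illustrative placeholders, not actual MU quotes):

```python
def implied_move(call_mid: float, put_mid: float, stock_price: float) -> float:
    """Rough expected earnings move implied by the ATM straddle, as a percent.

    This is the common back-of-the-envelope estimate: straddle cost divided by
    the share price. It ignores interest, skew, and time decay.
    """
    return (call_mid + put_mid) / stock_price * 100

# Illustrative numbers only, chosen to land near the 4.6% cited above:
move = implied_move(call_mid=1.20, put_mid=1.05, stock_price=48.49)
print(round(move, 1))  # → 4.6
```

Comparing this implied move against the stock's realized average move in recent quarters is exactly the contrast the paragraphs in this section draw.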
General Mills, Inc. (GIS) is confirmed to report earnings at approximately 7:00 AM ET on Wednesday, July 1, 2020. The consensus earnings estimate is $1.04 per share on revenue of $4.89 billion and the Earnings Whisper ® number is $1.10 per share. Investor sentiment going into the company's earnings release has 69% expecting an earnings beat. Consensus estimates are for year-over-year earnings growth of 25.30% with revenue increasing by 17.50%. Short interest has decreased by 9.4% since the company's last earnings release while the stock has drifted higher by 2.7% from its open following the earnings release to be 7.8% above its 200 day moving average of $54.91. Overall earnings estimates have been revised higher since the company's last earnings release. On Wednesday, June 24, 2020 there was some notable buying of 8,573 contracts of the $60.00 call expiring on Friday, July 17, 2020. Option traders are pricing in a 6.6% move on earnings and the stock has averaged a 3.0% move in recent quarters.
FedEx Corp. (FDX) is confirmed to report earnings at approximately 4:00 PM ET on Tuesday, June 30, 2020. The consensus earnings estimate is $1.42 per share on revenue of $16.31 billion and the Earnings Whisper ® number is $1.65 per share. Investor sentiment going into the company's earnings release has 61% expecting an earnings beat. Consensus estimates are for earnings to decline year-over-year by 71.66% with revenue decreasing by 8.41%. Short interest has increased by 10.4% since the company's last earnings release while the stock has drifted higher by 43.9% from its open following the earnings release to be 7.6% below its 200 day moving average of $140.75. Overall earnings estimates have been revised lower since the company's last earnings release. On Thursday, June 25, 2020 there was some notable buying of 1,768 contracts of the $145.00 call expiring on Thursday, July 2, 2020. Option traders are pricing in a 4.6% move on earnings and the stock has averaged a 7.7% move in recent quarters.
Conagra Brands, Inc. (CAG) is confirmed to report earnings at approximately 7:30 AM ET on Tuesday, June 30, 2020. The consensus earnings estimate is $0.66 per share on revenue of $3.24 billion and the Earnings Whisper ® number is $0.69 per share. Investor sentiment going into the company's earnings release has 66% expecting an earnings beat. Consensus estimates are for year-over-year earnings growth of 83.33% with revenue increasing by 23.99%. Short interest has decreased by 38.3% since the company's last earnings release while the stock has drifted higher by 6.3% from its open following the earnings release to be 6.4% above its 200 day moving average of $30.68. Overall earnings estimates have been revised higher since the company's last earnings release. On Thursday, June 11, 2020 there was some notable buying of 3,239 contracts of the $29.00 put expiring on Thursday, July 2, 2020. Option traders are pricing in a 4.7% move on earnings and the stock has averaged a 10.8% move in recent quarters.
Constellation Brands, Inc. (STZ) is confirmed to report earnings at approximately 7:30 AM ET on Wednesday, July 1, 2020. The consensus earnings estimate is $1.91 per share on revenue of $1.97 billion and the Earnings Whisper ® number is $2.12 per share. Investor sentiment going into the company's earnings release has 53% expecting an earnings beat. Consensus estimates are for earnings to decline year-over-year by 13.57% with revenue decreasing by 13.69%. Short interest has increased by 20.8% since the company's last earnings release while the stock has drifted higher by 25.2% from its open following the earnings release to be 5.2% below its 200 day moving average of $178.34. Overall earnings estimates have been revised lower since the company's last earnings release. On Tuesday, June 9, 2020 there was some notable buying of 888 contracts of the $195.00 call expiring on Friday, October 16, 2020. Option traders are pricing in a 3.1% move on earnings and the stock has averaged a 5.7% move in recent quarters.
Capri Holdings Limited (CPRI) is confirmed to report earnings at approximately 6:30 AM ET on Wednesday, July 1, 2020. The consensus earnings estimate is $0.32 per share on revenue of $1.18 billion and the Earnings Whisper ® number is $0.34 per share. Investor sentiment going into the company's earnings release has 39% expecting an earnings beat. The company's guidance was for earnings of $0.68 to $0.73 per share. Consensus estimates are for earnings to decline year-over-year by 49.21% with revenue decreasing by 12.20%. Short interest has increased by 35.1% since the company's last earnings release while the stock has drifted lower by 56.7% from its open following the earnings release to be 44.0% below its 200 day moving average of $25.67. Overall earnings estimates have been revised lower since the company's last earnings release. On Thursday, June 4, 2020 there was some notable buying of 11,042 contracts of the $17.50 put expiring on Friday, August 21, 2020. Option traders are pricing in a 10.8% move on earnings and the stock has averaged a 6.7% move in recent quarters.
X Financial (XYF) is confirmed to report earnings at approximately 5:00 PM ET on Tuesday, June 30, 2020. The consensus earnings estimate is $0.09 per share. Investor sentiment going into the company's earnings release has 25% expecting an earnings beat. Consensus estimates are for earnings to decline year-over-year by 55.00% with revenue increasing by 763.52%. Short interest has increased by 1.0% since the company's last earnings release while the stock has drifted lower by 1.2% from its open following the earnings release to be 37.7% below its 200 day moving average of $1.47. Overall earnings estimates have been unchanged since the company's last earnings release. The stock has averaged a 4.9% move on earnings in recent quarters.
Acuity Brands, Inc. (AYI) is confirmed to report earnings at approximately 8:40 AM ET on Tuesday, June 30, 2020. The consensus earnings estimate is $1.14 per share on revenue of $809.25 million and the Earnings Whisper ® number is $1.09 per share. Investor sentiment going into the company's earnings release has 42% expecting an earnings beat. Consensus estimates are for earnings to decline year-over-year by 51.90% with revenue decreasing by 14.60%. Short interest has increased by 48.5% since the company's last earnings release while the stock has drifted higher by 2.4% from its open following the earnings release to be 23.4% below its 200 day moving average of $110.25. Overall earnings estimates have been revised lower since the company's last earnings release. Option traders are pricing in a 9.2% move on earnings and the stock has averaged an 8.2% move in recent quarters.
Methode Electronics, Inc. (MEI) is confirmed to report earnings at approximately 7:00 AM ET on Tuesday, June 30, 2020. The consensus earnings estimate is $0.77 per share on revenue of $211.39 million. Investor sentiment going into the company's earnings release has 45% expecting an earnings beat. Consensus estimates are for year-over-year earnings growth of 24.19% with revenue decreasing by 20.53%. Short interest has increased by 6.2% since the company's last earnings release while the stock has drifted lower by 1.7% from its open following the earnings release to be 9.0% below its 200 day moving average of $32.97. Overall earnings estimates have been revised lower since the company's last earnings release. Option traders are pricing in an 18.4% move on earnings and the stock has averaged an 8.1% move in recent quarters.
UniFirst Corporation (UNF) is confirmed to report earnings at approximately 8:00 AM ET on Wednesday, July 1, 2020. The consensus earnings estimate is $1.17 per share on revenue of $378.28 million and the Earnings Whisper ® number is $1.25 per share. Investor sentiment going into the company's earnings release has 44% expecting an earnings beat. Consensus estimates are for earnings to decline year-over-year by 52.44% with revenue decreasing by 16.63%. Short interest has decreased by 2.7% since the company's last earnings release while the stock has drifted higher by 14.1% from its open following the earnings release to be 8.4% below its 200 day moving average of $186.14. Overall earnings estimates have been revised lower since the company's last earnings release. The stock has averaged a 7.0% move on earnings in recent quarters.